
In this section we look at one of the increasingly numerous buzzwords and catch phrases that inundate those of us who make a business out of technology.


Buzzword of the Week

Smart Data

Smart data is digital information that is formatted so it can be acted upon at the collection point before being sent to a downstream analytics platform for further data consolidation and analytics. The term smart data is often associated with the Internet of Things (IoT) and the data that smart sensors embedded in physical objects produce.

The label smart is directly related to a data entry point being intelligent enough to make some types of decisions on incoming data immediately, without requiring processing power from a centralized system. In the past, most analytics was done with batch processing. Data was collected according to schedule, converted to a desired state, put into a database and processed on an hourly, overnight or weekly basis. A drawback of this approach is that by the time the data is analyzed, it's already old. In contrast, smart data analytics programming (also called streaming analytics) monitors data at the source, captures events that are exceptions, assesses them, makes a decision and shares the output -- all within a specific window of time consisting of seconds or fractions of a second.
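To make the capture-assess-decide loop concrete, here is a minimal sketch of edge-side "smart data" filtering in Python. Everything in it is an illustrative assumption -- the sensor stub, the downstream stub, the temperature threshold and the one-second window -- not any particular vendor's API; it only shows readings being assessed where they are collected and exception events being acted on immediately.

import random
import time

TEMP_LIMIT_C = 85.0      # assumed threshold for an "exception" event
WINDOW_SECONDS = 1.0     # decision window measured in seconds

def read_sensor():
    """Stand-in for a real sensor driver; returns a temperature in Celsius."""
    return 70.0 + random.random() * 20.0

def send_downstream(event):
    """Stand-in for forwarding a summarized event to the analytics platform."""
    print("forwarding:", event)

def monitor(cycles=10):
    for _ in range(cycles):
        start = time.time()
        reading = read_sensor()
        if reading > TEMP_LIMIT_C:
            # Decide locally, without waiting on a centralized system.
            send_downstream({"type": "over_temperature", "value": reading, "ts": start})
        # Keep the whole capture-assess-decide loop inside the window.
        time.sleep(max(0.0, WINDOW_SECONDS - (time.time() - start)))

if __name__ == "__main__":
    monitor()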

A self-driving car, for example, can't afford to wait for data to be sent up to the cloud and output to be sent back. It requires data gathered through sensors to be smart, so the data can be immediately analyzed by the automobile's processors and outputs can immediately be sent to actuators that control the car's brakes and steering wheel. If the data is not in a form that can be analyzed as soon as processors receive it, the consequences can be deadly.

Data scientists, business analysts, IT managers, marketing professionals and manufacturers are also experimenting with how to use edge computing and smart data devices to bring in more revenue, improve decision-making processes and spot problems before equipment fails.

Taco Bell Programming

Everyone likes Taco Bell, right?

While many developers think it's important to keep up with the latest software development tools and languages (and it is), the philosophy behind Taco Bell programming flies in the face of this notion: committing to new tools with no real compelling reason is, in this view, asinine.

The mindset of a Taco Bell programmer is that almost every problem in software development has been encountered and solved in the past, so it's more efficient to use what the programmer already knows well and solve the problem quickly, even if it's at the cost of style.

The idea of using Taco Bell in a programming analogy is credited to Ted Dziuba, who coined the term in a blog post he wrote in 2010. According to Dziuba, each time a new programming language, third-party service or line of code is used, it introduces the possibility of failure. In contrast, fixing problems Taco Bell style with a well-proven tool set saves time in development, testing, training and meetings.
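As a hedged illustration of that mindset (not an example from Dziuba's post), the sketch below solves a routine task -- fetch a page and tally its most common words -- using nothing but the Python standard library instead of reaching for a new framework. The URL is a placeholder.

from collections import Counter
from urllib.request import urlopen
import re

def top_words(url, n=10):
    # Well-worn ingredients only: urllib to fetch, re to tokenize, Counter to tally.
    text = urlopen(url).read().decode("utf-8", errors="ignore")
    words = re.findall(r"[a-zA-Z]+", text.lower())
    return Counter(words).most_common(n)

if __name__ == "__main__":
    print(top_words("https://example.com"))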

Noisy Neighbor

Knock it off!

Noisy neighbor is a cloud computing infrastructure co-tenant that monopolizes cloud resources and negatively affects other tenants. The noisy neighbor effect causes virtual machines and applications that share the infrastructure to suffer from uneven performance.

The cloud is a multi-tenant environment, which means that a single architecture hosts multiple customers' applications and data. The noisy neighbor effect frequently occurs when an application or virtual machine (VM) uses the majority of available bandwidth. Bandwidth carries data throughout a network, so when one application or instance uses too much, the applications on other VMs suffer from slow speeds or latency.

One way to avoid the noisy neighbor effect is to use a bare-metal cloud. The bare-metal cloud runs one application at a time directly on the hardware, which creates a single-tenant environment and helps eliminate noisy neighbors. Setting input/output operations per second (IOPS) limits can also help prevent a single VM, application or instance from monopolizing resources and hindering the performance of other tenants.
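The idea behind an IOPS cap can be sketched as a simple token bucket that refuses operations once a tenant exceeds its per-second budget. Real clouds enforce this in the hypervisor or storage layer; the class name and numbers below are illustrative assumptions only.

import time

class IopsLimiter:
    def __init__(self, iops_limit):
        self.iops_limit = iops_limit
        self.tokens = float(iops_limit)
        self.last_refill = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to the limit.
        self.tokens = min(self.iops_limit,
                          self.tokens + (now - self.last_refill) * self.iops_limit)
        self.last_refill = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # the noisy tenant is throttled instead of starving its neighbors

limiter = IopsLimiter(iops_limit=500)
allowed = sum(limiter.allow() for _ in range(2000))
print(f"{allowed} of 2000 operations admitted this instant")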

While single-tenant environments avoid the noisy neighbor effect, they can't solve the problem if infrastructure is over-committed or the cloud environment is shared by too many applications. When this happens, it becomes necessary to move workloads across physical servers to ensure each application receives its necessary resources.

Human Capital Management

Tag'em and Bag'em

Human capital management (HCM) is an approach to employee staffing that perceives people as assets (human capital) whose current value can be measured and whose future value can be enhanced through investment.

Industrial Internet of Things

Machines are users too...

The Industrial Internet of Things (IIoT) is the part of the Internet of Things (IoT) that focuses on how smart machines, networked sensors and sensor analytics can help improve business-to-business (B2B) initiatives across a wide variety of industries, especially manufacturing.

Also known as the Industrial Internet, IIoT seeks to make better use of the sensor data, machine-to-machine (M2M) communication, machine learning and automation technologies that have existed in industrial settings for years. The driving philosophy behind the IIoT is that smart machines are better than humans at accurately, consistently capturing and communicating data. The data that sensors and smart machines generate can be analyzed to pick up on inefficiencies and problems sooner, saving time and money and supporting business intelligence efforts. In manufacturing specifically, IIoT holds great potential for quality control, sustainable and green practices, supply chain traceability and overall supply chain efficiency.
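A hedged sketch of the kind of analysis described above: flagging a machine whose recent sensor readings drift away from their historical baseline so a problem can be caught before failure. The readings and the three-sigma rule are made-up illustrations, not a reference implementation.

from statistics import mean, stdev

baseline = [0.42, 0.40, 0.43, 0.41, 0.39, 0.42, 0.40, 0.41]  # vibration (mm/s), assumed history
recent   = [0.44, 0.47, 0.55, 0.61]                           # latest readings from the machine

mu, sigma = mean(baseline), stdev(baseline)
alerts = [r for r in recent if abs(r - mu) > 3 * sigma]   # simple deviation check

if alerts:
    print(f"maintenance check suggested: {len(alerts)} readings outside the 3-sigma band")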

Major concerns surrounding the IIoT include the current lack of interoperability between manufacturing systems and the robust infrastructure required to support big data and big data analytics. To that end, the nonprofit Industrial Internet Consortium, which was founded in 2014, is focusing on creating open standards, communication protocols and architectures that will help businesses use the data that networked sensors and smart machinery generate to become more efficient, productive and profitable.

Machine Learning

One of the Top 10 Terms to watch in 2017!

Machine learning is an area of computer science and statistical modeling that allows a computer program to predict an outcome or make a decision without being explicitly programmed to do so.

Machine learning, which forms the basis for artificial intelligence (AI), is closely tied to data analytics and data mining programming. Both machine learning and data mining applications use mathematical algorithms to search through data and look for patterns. However, instead of extracting data for human comprehension, as is the case in data mining applications, machine learning uses algorithms to detect patterns in data and adjust program actions accordingly.

Big data and cloud-based predictive analytics services are helping programmers and data scientists to take advantage of machine learning in new ways. For example, Facebook's News Feed changes according to the user's personal interactions with other users. If a user frequently tags a friend in photos, writes on his wall or "likes" his links, the user's News Feed will show more of that friend's activity due to presumed closeness.

Machine learning tasks are typically classified into three broad categories: supervised learning, unsupervised learning and reinforcement learning. In supervised learning, the computer uses input examples and their desired outputs to determine a general rule for mapping inputs to outputs. Spam filters use supervised learning. In unsupervised learning, the computer is given input examples and is asked to cluster things that are alike. Google News, which clusters random news articles into topics, uses unsupervised machine learning. In reinforcement learning, the computer gathers inputs from its surroundings to accomplish a task. Self-driving cars use reinforcement learning.
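Here is a minimal supervised-learning sketch of the spam-filter example, using scikit-learn (assumed to be installed). The tiny training set is invented purely for illustration; a real filter would learn from thousands of labeled messages.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_texts = ["win a free prize now", "claim your free money",
               "meeting moved to 3pm", "lunch tomorrow?"]
train_labels = ["spam", "spam", "ham", "ham"]     # the desired outputs

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(train_texts)         # inputs mapped to word counts
model = MultinomialNB().fit(X, train_labels)      # learn a general rule from examples

print(model.predict(vectorizer.transform(["claim your free prize"])))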

Structured Content

Structured content is a modular approach to managing digital content that uses metadata tags and automation to publish content from a single source to multiple distribution channels. Structured content allows content creators to enter text once and use rules-based publishing to tailor the output for a specific delivery platform. For example, content can be structured in such a way that on a desktop monitor, the entire content body is displayed, while on a mobile device only the summary displays.
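The single-source idea can be sketched as one tagged content record plus per-channel rules that decide which fields are displayed. The field names and channels below are illustrative assumptions, not a particular CMS's schema.

# One structured content record, entered once.
record = {
    "title": "Smart Data Explained",
    "summary": "Data formatted so it can be acted on at the collection point.",
    "body": "Smart data is digital information that is formatted so it can be "
            "acted upon at the collection point before being sent downstream...",
    "tags": ["IoT", "analytics"],
}

# Rules-based publishing: each delivery platform gets only the fields it needs.
CHANNEL_RULES = {
    "desktop": ["title", "body", "tags"],   # entire content body on a monitor
    "mobile":  ["title", "summary"],        # only the summary on a phone
}

def render(record, channel):
    return {field: record[field] for field in CHANNEL_RULES[channel]}

print(render(record, "mobile"))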

Once content has been tagged, it is treated like data that can be accessed by a software application or an application programming interface (API). The higher the granularity of the metadata, the more structured the content becomes and the easier it is to use the content for different purposes. Tagged content is posted as one record and any changes to the record are applied to all instances of the content no matter where it lives.

The return on investment (ROI) for structured content includes:

  • Increased productivity - there is no need to modify style, tables, images or page breaks for different device types.
  • Enhanced organic search results - the additional metadata allows search engines to discover content and index it more accurately.
  • Personalization capabilities - marketing content can easily be assembled in different ways to meet the needs of specific audience segments or named accounts.

Structured content and the ability to use a single block of content to target multiple audiences requires a content management system (CMS) that supports the creation, organization and storage of content independent from layout. Structured content management is especially useful for marketers who want to customize messaging for account-based marketing initiatives. It is also useful in highly regulated industries in which a company could find itself in legal trouble if the content the company produces is inconsistent.

Structured content may also be referred to as intelligent content or semantic content.