The top 10 technology trends for 2019

November 20, 2018

A new report from Gartner has highlighted the top strategic technology trends that organisations need to explore in 2019.

Gartner defines a strategic technology trend as one with substantial disruptive potential that is beginning to break out of an emerging state into broader impact and use, or a rapidly growing trend with a high degree of volatility that is expected to reach a tipping point within the next five years.

The top 10 strategic technology trends for 2019 are:

Autonomous Things
Autonomous things, such as robots, drones and autonomous vehicles, use AI to automate functions previously performed by humans. Their automation goes beyond that provided by rigid programming models: they exploit AI to deliver advanced behaviours that interact more naturally with their surroundings and with people.

“As autonomous things proliferate, we expect a shift from stand-alone intelligent things to a swarm of collaborative intelligent things, with multiple devices working together, either independently of people or with human input,” said David Cearley, vice president and Gartner Fellow. “For example, if a drone examined a large field and found that it was ready for harvesting, it could dispatch an ‘autonomous harvester.’ Or in the delivery market, the most effective solution may be to use an autonomous vehicle to move packages to the target area. Robots and drones on board the vehicle could then ensure final delivery of the package.”
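
To make the swarm idea concrete, here is a minimal Python sketch of a scout drone and a harvester coordinating through a shared event queue. Everything in it is hypothetical (the function names, the readiness threshold, the in-memory queue); a real swarm would communicate over a message broker such as MQTT or Kafka.

```python
import queue
import threading

# Hypothetical in-memory event bus; a production swarm would use a
# real message broker (MQTT, Kafka, etc.) instead.
events = queue.Queue()

def scout_drone(field_id: str, ripeness: float):
    """Survey a field and publish a harvest request if it is ready."""
    if ripeness >= 0.9:  # assumed readiness threshold
        events.put({"type": "harvest", "field": field_id})
        print(f"drone: field {field_id} ready, harvester requested")

def harvester_worker():
    """Wait for harvest events and act on them."""
    while True:
        event = events.get()
        if event["type"] == "harvest":
            print(f"harvester: dispatched to field {event['field']}")
        events.task_done()

threading.Thread(target=harvester_worker, daemon=True).start()
scout_drone("north-40", ripeness=0.95)
events.join()  # block until the harvester has handled the event
```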

Augmented Analytics
Augmented analytics focuses on a specific area of augmented intelligence, using machine learning (ML) to transform how analytics content is developed, consumed and shared. Augmented analytics capabilities will advance rapidly to mainstream adoption, as a key feature of data preparation, data management, modern analytics, business process management, process mining and data science platforms.

Automated insights from augmented analytics will also be embedded in enterprise applications — for example, those of the HR, finance, sales, marketing, customer service, procurement and asset management departments — to optimise the decisions and actions of all employees within their context, not just those of analysts and data scientists. Augmented analytics automates the process of data preparation, insight generation and insight visualisation, eliminating the need for professional data scientists in many situations.
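
As a rough illustration of what this automation involves, the following sketch uses scikit-learn to automate data preparation (imputation, encoding), fit a model, and surface the strongest drivers of a target column as a simple "insight". The table and column names are invented, and commercial augmented-analytics products wrap far more sophisticated pipelines than this toy example.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Made-up HR attrition data; any table with a binary target works.
df = pd.DataFrame({
    "tenure_years": [1, 7, 2, 9, 3, 8, 1, 6],
    "department":   ["sales", "hr", "sales", "it", "it", "hr", "sales", "it"],
    "left_company": [1, 0, 1, 0, 1, 0, 1, 0],
})
X, y = df.drop(columns="left_company"), df["left_company"]

# Automated preparation: impute numeric gaps, one-hot encode categoricals.
prep = ColumnTransformer([
    ("num", SimpleImputer(strategy="median"), ["tenure_years"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["department"]),
])
model = Pipeline([("prep", prep), ("clf", RandomForestClassifier(random_state=0))])
model.fit(X, y)

# "Insight generation": rank features by importance for a plain-language summary.
names = model.named_steps["prep"].get_feature_names_out()
scores = model.named_steps["clf"].feature_importances_
for name, score in sorted(zip(names, scores), key=lambda p: -p[1]):
    print(f"{name}: {score:.2f}")
```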

“This will lead to citizen data science, an emerging set of capabilities and practices that enables users whose main job is outside the field of statistics and analytics to extract predictive and prescriptive insights from data,” said Mr. Cearley. “Through 2020, the number of citizen data scientists will grow five times faster than the number of expert data scientists. Organisations can use citizen data scientists to fill the data science and machine learning talent gap caused by the shortage and high cost of data scientists.”

AI-Driven Development
The market is rapidly shifting from an approach in which professional data scientists must partner with application developers to create most AI-enhanced solutions to a model in which the professional developer can operate alone using predefined models delivered as a service. This provides the developer with an ecosystem of AI algorithms and models, as well as development tools tailored to integrating AI capabilities and models into a solution.
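
A minimal sketch of this model-as-a-service pattern: the application developer calls a hosted, pre-trained model over HTTP instead of training one. The endpoint URL, authentication scheme and response schema below are placeholders, not any particular vendor's API.

```python
import requests

# Hypothetical hosted-model endpoint and schema; substitute your
# provider's actual URL, auth scheme and payload format.
ENDPOINT = "https://api.example.com/v1/models/sentiment:predict"
API_KEY = "YOUR_API_KEY"

def classify_sentiment(text: str) -> str:
    """Call a pre-trained sentiment model instead of building one."""
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"instances": [{"text": text}]},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["predictions"][0]["label"]

print(classify_sentiment("The new release fixed every bug we reported."))
```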

Another level of opportunity for professional application development arises as AI is applied to the development process itself to automate various data science, application development and testing functions. By 2022, at least 40 percent of new application development projects will have AI co-developers on their team.

“Ultimately, highly advanced AI-powered development environments automating both functional and non-functional aspects of applications will give rise to a new age of the ‘citizen application developer’ where non-professionals will be able to use AI-driven tools to automatically generate new solutions. Tools that enable non-professionals to generate applications without coding are not new, but we expect that AI-powered systems will drive a new level of flexibility,” said Mr. Cearley.

Digital Twins
A digital twin refers to the digital representation of a real-world entity or system. By 2020, Gartner estimates there will be more than 20 billion connected sensors and endpoints and digital twins will exist for potentially billions of things. Organisations will implement digital twins simply at first. They will evolve them over time, improving their ability to collect and visualise the right data, apply the right analytics and rules, and respond effectively to business objectives.
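
A minimal sketch of the idea, assuming a single pump monitored by a temperature sensor: the class below mirrors the asset's last known state and applies one simple rule to it. The class name, fields and threshold are all hypothetical.

```python
from datetime import datetime, timezone

class PumpTwin:
    """Toy digital twin: mirrors the last known state of one physical
    pump and applies a simple rule to the data it collects."""

    MAX_TEMP_C = 80.0  # assumed operating limit

    def __init__(self, asset_id: str):
        self.asset_id = asset_id
        self.state = {}          # last telemetry snapshot
        self.updated_at = None

    def ingest(self, telemetry: dict):
        """Update the twin from a sensor reading (e.g. via an IoT hub)."""
        self.state.update(telemetry)
        self.updated_at = datetime.now(timezone.utc)

    def alerts(self):
        """Apply analytics/rules to the mirrored state."""
        if self.state.get("temp_c", 0) > self.MAX_TEMP_C:
            yield f"{self.asset_id}: overheating at {self.state['temp_c']}°C"

twin = PumpTwin("pump-17")
twin.ingest({"temp_c": 86.5, "rpm": 1450})
for alert in twin.alerts():
    print(alert)
```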

“One aspect of the digital twin evolution that moves beyond IoT will be enterprises implementing digital twins of their organisations (DTOs). A DTO is a dynamic software model that relies on operational or other data to understand how an organisation operationalises its business model, connects with its current state, deploys resources and responds to changes to deliver expected customer value,” said Mr. Cearley. “DTOs help drive efficiencies in business processes, as well as create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically.”

Empowered Edge
The edge refers to endpoint devices used by people or embedded in the world around us. Edge computing describes a computing topology in which information processing, and content collection and delivery, are placed closer to these endpoints. The aim is to keep traffic and processing local, reducing backhaul traffic and latency.
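
The sketch below illustrates that goal with a hypothetical gateway that buffers a local sensor stream and forwards only periodic summaries upstream, so most raw readings never leave the edge.

```python
import statistics

class EdgeGateway:
    """Toy edge node: buffers raw readings locally and forwards only a
    compact summary, so most traffic never leaves the edge."""

    def __init__(self, window: int = 100):
        self.window = window
        self.buffer = []

    def on_reading(self, value: float):
        self.buffer.append(value)
        if len(self.buffer) >= self.window:
            summary = {
                "count": len(self.buffer),
                "mean": statistics.fmean(self.buffer),
                "max": max(self.buffer),
            }
            self.send_upstream(summary)  # one message per `window` readings
            self.buffer.clear()

    def send_upstream(self, summary: dict):
        # Placeholder for a real uplink (HTTPS, MQTT, ...).
        print("to cloud:", summary)

gw = EdgeGateway()
for i in range(250):
    gw.on_reading(20.0 + (i % 7) * 0.1)  # simulated sensor stream
```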

Digital Ethics and Privacy
Digital ethics and privacy is a growing concern for individuals, organisations and governments, as people become increasingly concerned about how their personal information is used by public- and private-sector organisations.

“Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually about more than just these components,” said Mr. Cearley. “Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately an organisation’s position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond ‘are we compliant’ toward ‘are we doing the right thing.’”

Quantum Computing
Quantum computing (QC) is a type of nonclassical computing that operates on the quantum state of subatomic particles (for example, electrons and ions) that represent information as elements denoted as quantum bits (qubits). The parallel execution and exponential scalability of quantum computers mean they excel at problems too complex for a traditional approach, or where traditional algorithms would take too long to find a solution.
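
To give "qubit" some concreteness, the following NumPy sketch simulates one on a classical machine: a Hadamard gate puts a basis state into equal superposition, and the Born rule gives the measurement probabilities. This is only a classical simulation of the underlying mathematics, not how quantum hardware works, and it is exactly the exponential state growth shown at the end that such simulation cannot scale to.

```python
import numpy as np

# A qubit's state is a length-2 complex vector; |0> = (1, 0).
ket0 = np.array([1, 0], dtype=complex)

# Hadamard gate: puts a basis state into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0             # now (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2   # Born rule: measurement probabilities
print(probs)                 # ~[0.5, 0.5]

# Two qubits live in a 4-dimensional space (tensor product) -- the
# exponential growth that makes classical simulation hard.
two_qubits = np.kron(state, state)
print(two_qubits.shape)      # (4,)
```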

Industries such as automotive, financial services, insurance and pharmaceuticals, along with military and research organisations, have the most to gain from advances in QC. In the pharmaceutical industry, for example, QC could be used to model molecular interactions at the atomic level to accelerate time to market for new cancer-treating drugs, or to predict the interactions of proteins faster and more accurately, leading to new pharmaceutical methodologies.

“CIOs and IT leaders should start planning for QC by increasing their understanding of how it can apply to real-world business problems. Learn while the technology is still in an emerging state. Identify real-world problems where QC has potential and consider its possible impact on security,” said Mr. Cearley. “But don’t believe the hype that it will revolutionise things in the next few years. Most organisations should learn about and monitor QC through 2022 and perhaps exploit it from 2023 or 2025.”

Gartner clients can learn more in the Gartner Special Report “Top 10 Strategic Technology Trends for 2019.” Additional detailed analysis on each tech trend can be found in the Gartner YouTube video “Gartner Top 10 Strategic Technology Trends 2019.”
