Technology continues to evolve at an incredible rate, but the following technology trends will shape the decade to come.
As the first month of the new year comes to an end, I want to take this chance to look ahead at the tech industry and the innovations I believe we will see in 2020. Drawing on knowledge and experience from the past year, combined with my years in cloud and colocation, I've mapped out seven predictions that will shape our technology landscape in the coming year.
In the months ahead, drivers like Edge computing, SD-WAN adoption, and active deployments for greater uptime will reshape how technology affects every industry. These predictions are based on our experience in the infrastructure industry.
Take a look at what is ahead for technology, and learn more about:
- Repatriating public cloud workloads
- Empowering the Edge
- Leveraging cloud disaster recovery
- Managing hybrid digital infrastructure
- Complying with data privacy controls
- Strengthening cybersecurity
- Cloud networking
#1 Repatriating Public Cloud Workloads
In 2020, we will see public cloud repatriation start to grow. Government regulations, pricing, control, and security will continue to force enterprises to re-evaluate the most reliable, cost effective, and compliant venues for their workloads.
Customers are now realizing that no one cloud fits all, and that multi-vendor cloud solutions, including some customer-owned equipment, are required to responsibly operate their businesses.
An IDC survey from 2018 found that 80 percent of respondents had moved cloud workloads on-premises or to a private cloud solution within the past year.
Organizations of all sizes are transitioning, yet it is the larger companies that are making the biggest waves. For example, Dropbox, which was born in the cloud on Amazon Web Services (AWS), recently moved 600 petabytes of data off the public cloud and adopted a hybrid approach, Data Center Knowledge reported. Dropbox said the move away from AWS delivered faster performance while saving up to $74.6 million in operational expenses.
According to numerous surveys and reports, cloud repatriation is not slowing down anytime soon. 451 Research’s Voice of the Enterprise (VoTE) Cloud Transformation, Organizational Dynamics survey stated that 34 percent of respondents had already moved their workloads from a public cloud to a private environment (cloud or otherwise).
A private cloud service offers customers transparency into pricing structures, and the pay-as-you-go approach allows companies to pay only for what they need. According to research by Stanford's Dr. Jonathan Koomey, enterprises are wasting up to $62 billion per year paying for public cloud capacity they do not need.
Is it Time to Say Goodbye to Public Cloud?
Everyone’s situation is different. Before you migrate all of your workloads out, ask yourself the following:
- Are you wasting money paying for capacity that is not being used?
- Are you experiencing regular performance and availability issues?
- Are you subject to differing compliance standards and regulations?
- Do your workloads require low latencies?
- Do you have the IT staff bandwidth to take control of your workloads?
If you answered no to all of these, public cloud may be the right solution for you. But if you answered yes to any of the questions above, it might be time to consider public cloud repatriation.
The key is to perform ongoing cost and performance evaluations to determine whether public cloud or private deployments are best for your workloads and your business.
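One way to start such an evaluation is a simple break-even model. The sketch below is illustrative only: every rate, VM count, and dollar figure is a made-up placeholder, not real pricing; substitute quotes from your own providers.

```python
# Hypothetical break-even sketch: public cloud spend vs. a private deployment.
# All numbers below are invented placeholders -- substitute real quotes.

HOURS_PER_MONTH = 730

def public_cloud_monthly(vms: int, hourly_rate: float) -> float:
    # Pay-as-you-go: you are billed for every provisioned hour, idle or not.
    return vms * hourly_rate * HOURS_PER_MONTH

def private_monthly(capex: float, amortize_months: int, opex: float) -> float:
    # Owned equipment: amortized capital cost plus fixed monthly operating cost.
    return capex / amortize_months + opex

public = public_cloud_monthly(vms=50, hourly_rate=0.20)
idle_spend = public * (1 - 0.40)  # money burned at 40% average utilization
private = private_monthly(capex=180_000, amortize_months=36, opex=1_500)

print(f"public: ${public:,.0f}/mo  idle: ${idle_spend:,.0f}/mo  private: ${private:,.0f}/mo")
```

Even a crude model like this makes the utilization question concrete: the idle-spend line is exactly the kind of waste the Koomey figure describes.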
#2 Empowering the Edge
A digital network is a transportation system for information. In this network, the Edge consists of servers pushed as far outward as possible to reduce the time it takes users to access whatever they need. Edge computing is still early in its maturity.
We will continue to see Edge data processing and storage requirements explode in the new decade. Data creation and consumption continue to increase at the pace of Moore's law, and physics limits the distance over which this data can travel and still be used effectively.
Edge is also very much tied to the success and maturity of 5G and IoT. Computing infrastructure locality will become more important as 5G, IoT, and AI become mainstream. The word “Edge” in this context means literal geographic distribution.
Edge computing is computing that is done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It does not mean the cloud will disappear. It means the cloud is coming to you.
Applications may be expedited when their processors are stationed closer to where the data is collected. The future of both the technology industry and computing markets may depend on how those points on the geographical map and the points on the network map are finally interfaced.
Where those points reside, especially as 5G Wireless networks are being built, may end up determining who gets to control them, and who gets to regulate them.
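The physics limit above can be made concrete with a back-of-the-envelope calculation. Light in fiber travels at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), so round-trip time grows by about 1 ms for every 100 km of fiber path, before any routing or queuing delay:

```python
# Why distance matters at the Edge: best-case round-trip time over fiber.
# Light in fiber covers roughly 200 km per millisecond one way.

FIBER_KM_PER_MS = 200.0

def min_round_trip_ms(distance_km: float) -> float:
    """Theoretical best-case RTT; real networks add routing and queuing delay."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (10, 100, 1000):
    print(f"{km:>5} km -> {min_round_trip_ms(km):.1f} ms minimum RTT")
```

No amount of engineering can beat these floors, which is why latency-sensitive workloads push computing toward the user.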
#3 Leveraging Cloud Disaster Recovery
In the coming year, I predict that more organizations across industries will seek the reliability of a hybrid cloud service provider. A large percentage of mission-critical applications will not be redesigned for application-level disaster recovery. This will require redundancy at every level, starting at the application layer and extending through the data center, network, storage, and compute layers.
Disaster recovery will also shift further to active/active deployments for greater uptime, with air-gapped data backups to protect against ransomware threats. Many disaster recovery solutions claim a business can recover, but in reality their definition of recovery is not the same as the customer's.
We will continue to see businesses move toward active/active deployments as the costs and barriers come down. True recovery is a complicated process. It requires much more than just pushing a VM to the cloud. As always, networking is a key piece of this puzzle.
When it comes down to it, the 3/2/1 mantra is still very relevant. Keep three copies of data on two types of storage, with one in a remote location.
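The 3/2/1 rule lends itself to a mechanical check. Below is a minimal sketch; the inventory format (a list of copies with a `media` type and an `offsite` flag) is invented for illustration, not taken from any particular backup product:

```python
# Minimal 3/2/1 check against a (hypothetical) backup inventory:
# 3 copies, on at least 2 media types, with at least 1 offsite.

def satisfies_3_2_1(copies: list[dict]) -> bool:
    enough_copies = len(copies) >= 3
    enough_media = len({c["media"] for c in copies}) >= 2
    has_offsite = any(c["offsite"] for c in copies)
    return enough_copies and enough_media and has_offsite

inventory = [
    {"media": "disk", "offsite": False},    # primary copy
    {"media": "object", "offsite": False},  # local object storage
    {"media": "object", "offsite": True},   # replicated to a remote site
]
print(satisfies_3_2_1(inventory))  # True
```

A single on-premises copy, however well protected, fails all three tests at once.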
On the other hand, data gravity works to keep data in its current location, preventing it from moving to the cloud. For some companies, data gravity will continue to hinder digital transformation again this year.
#4 Managing Hybrid Digital Infrastructure
Hybrid digital infrastructure management will emerge as a critical IT capability in which an organization and all of its vendors must actively participate. The scale and complexity of managing hybrid infrastructure is increasing rapidly, even as integrated tools emerge to address it.
Infrastructure and data are located everywhere. Management tools will continue to be deployed to help address compliance, security, and control over distributed cloud workloads.
Leadership will have to select the right mix of cloud and traditional IT for their specific business objectives. The mix of many different elements such as edge, hybrid cloud, workflow and management creates complex infrastructures.
Gartner recommends focusing on workflow visualization, using an integrated toolset, and developing a center of excellence to work on these issues. Enterprise cloud options will continue to reshape the IT landscape. The ongoing IT migration from on-premises data centers to colocation and cloud providers is a multi-year transition.
We will continue to see a rational reallocation of enterprise IT resources in 2020.
#5 Complying with Data Privacy Controls
Companies or their underlying IT providers that misuse customer data will lose the trust of their customers. Data privacy is important to consumers because a breach of personal information can damage an individual’s fundamental rights and freedoms, including the risk of identity theft and other types of fraud. Data privacy concerns are also top of mind for organizations. Any unauthorized collection, careless processing, or poor protection of data can introduce multiple risks.
As a result, the EU's GDPR (General Data Protection Regulation) has had an important impact. Many states within the United States have since adopted their own privacy protection laws, such as the CCPA (California Consumer Privacy Act). We should expect more regulation and awareness in the future, as the U.S. is currently working to implement federal data privacy laws.
Trustworthiness is a key factor in driving an organization's revenue and profitability. Building customer trust is difficult, but losing it is easy. Actively managing the location and use of an organization's data will be table stakes.
You may start to notice links or buttons at the bottom of the page reading "Do Not Sell My Personal Information." This change is one of many going into effect this month thanks to the new data privacy law known as the California Consumer Privacy Act.
This law essentially allows users to access the personal data that companies have collected on them, and have it deleted to prevent it from being sold to third parties. Since it is a lot more work to create a separate infrastructure just for California residents to opt out of the data collection industry, these requirements will transform the internet for everyone. With the new regulations empowering users to protect their data, it is finally time for better data privacy.
#6 Strengthening Cybersecurity
As always, cybersecurity is top of mind in the technology industry. Currently, there are concerns that the Domain Name System (DNS) could fracture as DNS over HTTPS (DoH) and other encrypted DNS queries are pushed for privacy reasons.
DNS, by any account, is the most successful distributed information system humans have ever created. Another cybersecurity concern is a potential crypto apocalypse, which some predict could begin to take shape in 2020.
Essentially, anything below 256-bit encryption could become fairly easy to break once quantum machines of sufficient size exist. This will spur a different kind of Y2K effort: swapping out devices for hardware or code with stronger encryption capabilities that use new ciphers.
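The reason 256-bit symmetric keys are the recommended floor is Grover's algorithm, which lets a sufficiently large quantum computer search an n-bit key space in roughly 2^(n/2) operations, effectively halving a symmetric key's brute-force strength. A tiny worked calculation:

```python
# Grover's algorithm halves the effective brute-force strength of a
# symmetric key: searching an n-bit key space takes ~2^(n/2) operations.

def post_quantum_strength_bits(key_bits: int) -> int:
    return key_bits // 2

for bits in (128, 192, 256):
    print(f"{bits}-bit key -> ~{post_quantum_strength_bits(bits)}-bit effective strength")
```

This is why AES-256 (128-bit effective post-quantum strength) is considered the safe floor, while 128-bit keys drop to a marginal 64-bit effective strength.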
In today’s landscape, breaches are inevitable and perimeter security is no longer an adequate solution. Once you have located sensitive data, you need to apply security to the data itself.
Encryption enables you to apply protection directly to the data. Even if the perimeter is breached, you can be sure that your information remains secure – wherever it resides. Remember, encryption is only as strong as its key management counterpart.
#7 Cloud Networking
Users today expect applications and Internet access to be available 100 percent of the time. For IT professionals and business leaders to accomplish that goal, there can be no single points of failure. A single circuit connecting an office is a guaranteed way to fail.
In today’s telecom environment, businesses can find cost effective options from two or more carriers in a majority of situations. With new 5G and Satellite options on the horizon, plus extensive investment in fiber for underserviced areas, the competition, speeds, and cost continue to improve.
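The case for carrier diversity is simple probability: if circuits fail independently, the site is down only when every circuit is down at once. Two 99% circuits fail together only 0.01 × 0.01 = 0.0001 of the time, i.e. 99.99% combined availability:

```python
# Combined availability of independent circuits: the site is down only
# when every circuit is down simultaneously.

def combined_availability(*circuits: float) -> float:
    downtime = 1.0
    for availability in circuits:
        downtime *= (1.0 - availability)
    return 1.0 - downtime

print(f"one 99% circuit:  {0.99:.4%}")
print(f"two 99% circuits: {combined_availability(0.99, 0.99):.4%}")
```

The caveat is independence: two circuits from the same carrier sharing the same conduit can fail together, which is why the diversity has to extend to carriers and physical paths, not just circuit count.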
“By 2022, 70 percent of data will be created outside the data center or cloud, up from 40 percent today,” said David Cappuccio, Distinguished Analyst at Gartner. “How do we build a network to move around the cloud?”
When considering investing in an SD-WAN solution, be sure to pick a vendor with maturity in these key features:
- Link/Carrier diversity: Actively and automatically blend multiple circuits for bandwidth and availability.
- Automated performance routing: Optimize and prioritize applications between users and services in real-time without manual intervention. This goes beyond quality of service (QoS) and simple link re-route on circuit failures.
- Quality of experience: Ensure real-time applications have resilience by reserving bandwidth for real-time flows and packet duplication across multiple links to ensure delivery of critical voice and video streams.
- Security: Ensure tunnels are encrypted using secure ciphers and robust key management for both the data plane and management plane, as well as hardened appliances.
- Centralized management: Look for single pane-of-glass management and monitoring of the entire SD-WAN, without having to log in to each appliance to deploy, make changes, or troubleshoot issues.
- Telecom cost optimization and lifecycle management: Add or remove a telecom circuit with no outage; support for 3+ links; and bandwidth, packet-loss, and latency information for each link to verify that carriers meet contracted bandwidth and SLAs.
- Monitoring and trending: Provide visibility and optimization of the user experience at remote sites to support performance troubleshooting as well as circuit/telecom troubleshooting.
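Several of these features hinge on the same core idea: continuously score each link from live measurements and steer traffic to the best one. The sketch below is a toy illustration; the weights, thresholds, and link names are all invented, and real SD-WAN products use far richer, per-application policies:

```python
# Toy sketch of automated performance routing: score links from live
# latency/loss/jitter measurements (invented weights) and pick the best.

def link_score(latency_ms: float, loss_pct: float, jitter_ms: float) -> float:
    # Lower is better; loss is penalized most heavily for real-time traffic.
    return latency_ms + 100.0 * loss_pct + 2.0 * jitter_ms

links = {
    "mpls":  link_score(latency_ms=18.0, loss_pct=0.00, jitter_ms=1.0),
    "cable": link_score(latency_ms=12.0, loss_pct=0.02, jitter_ms=4.0),
    "lte":   link_score(latency_ms=45.0, loss_pct=0.01, jitter_ms=9.0),
}
best = min(links, key=links.get)
print(best, links)  # "mpls" wins: clean and low-jitter beats raw speed
```

Note that the fastest link (cable) loses here because its packet loss and jitter hurt real-time flows; this is the distinction between simple link re-routing and true quality-of-experience routing.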
Start Your Decade Off Right
From secure and always on colocation, to the compliance, control, and flexibility of cloud, LightEdge has you covered.
With over 20 years in business, LightEdge offers a full stack of best-in-class IT services to provide flexibility, security, and control for any stage of a customer’s technology roadmap. Our solutions include premier colocation across seven purpose-built data centers, industry-leading private Infrastructure as a Service (IaaS) and cloud platforms, and the top global security and compliance measures.
Our owned and operated facilities, integrated disaster recovery solutions, and premium cloud choices make up a true Hybrid Solution Center model. LightEdge’s highly-interconnected data center facilities now span Des Moines, IA, Kansas City, MO, Omaha, NE, Austin, TX and Raleigh, NC.