SolarWinds 2017 Predictions

By Head Geeks, SolarWinds.

1.      Move over SaaS: FaaS Gains Speed

The industry’s leading cloud service providers have introduced a new way to work in the cloud: Functions as a Service (FaaS). This new cloud computing category allows customers to develop, run, and manage application functionality without the headache of architecting and overseeing the backend infrastructure.


Pioneered in 2014 by AWS Lambda (similar solutions have since been introduced by Azure and Google Cloud Platform), FaaS is typically used to build microservices applications. It lowers the barrier to adoption by decoupling application code from the underlying platform architecture, allowing IT professionals to develop programs that perform specific tasks without the additional plumbing. This new chapter in cloud computing signals the continuing maturation of the cloud market from its roots in infrastructure as a service (IaaS) through PaaS, SaaS, FaaS, and beyond.
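
To make the model concrete, below is a minimal sketch of a function as it might be written for AWS Lambda in Python. The handler signature follows Lambda’s documented convention; the event field and the response shape are illustrative assumptions rather than details from any particular deployment.

    # A minimal FaaS-style function: the provider invokes this handler on
    # demand and manages all of the underlying infrastructure.
    # The "name" field in the event payload is a hypothetical example input.
    def lambda_handler(event, context):
        name = event.get("name", "world")
        return {
            "statusCode": 200,  # response shape used behind an HTTP trigger
            "body": "Hello, " + name + "!",
        }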


We predict that in 2017, more specialised services such as FaaS will continue to proliferate, because their targeted efficiency delivers both a better experience and an improved pricing structure: customers pay only while their functions execute, not for idle infrastructure. The ability to run nearly any type of application or function, with zero infrastructure administration required of the IT professional, is tremendously appealing, and new services will likely focus on these benefits.


2.      Clarity on Containers

Containers from the likes of Google, Docker, CoreOS and Joyent continue to be a key area of discussion in the cloud computing space. In the past year, organisations across all major industries, from finance to e-commerce, took notice of containers as an exciting new method of operating system virtualisation. However, this broader industry awareness led to a wave of rapid adoption without a fundamental understanding of the differences between containers and virtual machines.


Despite many early adopters implementing them as such, containers are not drop-in replacements for virtual machines. In short, a container bundles an entire runtime environment (an application together with its dependencies, libraries, other binaries, and the configuration files needed to run it) into one package designed for lightweight, short-term use; unlike a virtual machine, it shares the host operating system’s kernel rather than booting its own. When implemented correctly, containers enable much more agile and portable software development environments.
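
As a rough illustration of that lightweight, short-lived model, the sketch below uses the Docker SDK for Python (the community-maintained docker package, assumed here to be installed alongside a running Docker daemon) to start a throwaway container, run a single command in it, and discard it. The alpine image and the echo command are arbitrary examples.

    # Requires: pip install docker, plus a local Docker daemon.
    # Runs one short-lived container and removes it when the command exits.
    import docker

    client = docker.from_env()  # connect to the local Docker daemon
    output = client.containers.run(
        "alpine",                            # tiny base image, for illustration
        ["echo", "hello from a container"],
        remove=True,                         # discard the container on exit
    )
    print(output.decode().strip())

The entire lifecycle here (create, run, destroy) takes seconds, which is precisely the property that makes containers suited to the portable, short-term workloads described above.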


We predict that in 2017, IT departments at large will finally come to a greater understanding of the fundamentals of container technology and of how it can realistically and appropriately be used in IT operations alongside virtual infrastructure. For example, by packaging workloads into containers, which can spin up very quickly on another vendor’s platform if needed, IT professionals can win back some of the management and control otherwise ceded to public cloud infrastructure SLAs. Early adopters that implemented containerisation in the last several years without a specific strategy in mind will also need to reassess their initial deployments to determine whether they are seeing any noticeable benefits.


The proliferation of containers as a computing strategy within IT departments will simultaneously give rise to greater security concerns, such as the risk that compromising the single host OS kernel exposes every container running on it and, as with VMs, the problem of sprawl. It will also increase demand for IT professionals skilled in the coding languages and tooling specific to containerisation.


3.      Data Breaches on the Rise (Again)

It seems data breaches are always top of mind for coming-year predictions, but every year the concern grows. Just this past month, it was announced that in 2014, Yahoo! fell victim to the biggest data breach in history, losing some 500 million accounts’ worth of personal user data to attackers. This trend shows no sign of slowing down.


We predict that in 2017, there will be exponential increases in both the volume and visibility of data breaches, particularly for large corporations.


In the next year, more companies will also begin to appreciate the volume and severity of these attacks on data. To combat them, we expect to see a new crop of information security firms enter the marketplace to provide guidance on penetration testing and other security expertise. At the same time, however, added government funding for the Cybersecurity National Action Plan (CNAP) means that we will likely see an increased number of individuals billing themselves as security experts when that title may only be loosely applicable; organisations seeking counsel should be wary of these new “experts.”


Simultaneously, this increase in data breaches will force organisations to weigh the implications of potential data loss against the expense of hiring security experts. In many cases, businesses in 2017 will take a calculated risk, deciding what they can “afford to lose” rather than paying to prevent data loss entirely. This trade-off will be especially stark in the case of ransomware attacks, where it is nearly impossible to guarantee that hackers will not leak or reveal stolen data even after receiving the “ransom” payment.


Finally, corporations and IT professionals must become hyperaware of attackers’ increased ability to take advantage of automation. The speed and ease with which an automated network breach can take place is new, and will ultimately aid in making corporate data breaches even more commonplace in 2017.


4.      To Blockchain or Not to Blockchain

Blockchain, the peer-to-peer distributed ledger technology that began with bitcoin, has been on the minds of financial institutions for a number of years, but in 2016 the technology expanded beyond the bounds of the financial industry. At a time when companies are finding it difficult to manage and secure their data, blockchain offers a seemingly perfect solution. Supply chain use cases, for example, are now being tested and implemented to ensure the safety, security, and integrity of the information associated with those processes.
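
To show what “ledger” means mechanically, here is a toy sketch of the core idea in Python: each block commits to a hash of its predecessor, so tampering with any historical entry breaks every link after it. This is a deliberately simplified illustration, not any production blockchain protocol (there is no consensus, networking, or proof-of-work here), and the supply chain entries are hypothetical.

    # A toy hash-chained ledger: each block stores the previous block's
    # hash, so editing any earlier record invalidates the chain.
    import hashlib
    import json

    def block_hash(block):
        # Deterministically hash a block's contents.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_block(chain, data):
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"index": len(chain), "data": data, "prev_hash": prev})

    def verify(chain):
        # Recompute every link in the chain.
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    ledger = []
    add_block(ledger, "shipment 42 left the warehouse")  # hypothetical entries
    add_block(ledger, "shipment 42 cleared customs")
    print(verify(ledger))           # True: the chain is intact
    ledger[0]["data"] = "tampered"
    print(verify(ledger))           # False: history cannot be silently rewritten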


In 2016, a multitude of companies, financial firms in particular, began to test blockchain in lab environments, yet few are using it in practice. One roadblock is agreement on a common protocol, which has yet to be reached. The other, of course, stems from a lack of proven security practices, especially for financial organisations.


We predict that in 2017, blockchain will gain steam as a buzzword and much more research will go into the technology and its functionality, although it is unlikely that any effective new capabilities will be ready for broad implementation. However, 2018 will prove much more action-packed in terms of companies figuring out how to take advantage of distributed ledger technology to solve their data management issues.


5.      Shifting IT Roles

As traditional, siloed IT roles (network administrators, storage administrators, systems administrators, database administrators, and more) continue to take on new responsibilities, such as working with cloud service providers in hybrid environments, implementing new technologies like containers and microservices, and acting as IT liaisons to business leaders, 2017 signals the return of the age of education and certification.


The ability to quickly learn new IT concepts and skills will be more important than being an expert in any one technology. While siloed experts who managed disparate parts of the infrastructure and application stack played a fundamental role in the traditional IT department, the modern data center is more interconnected than ever. As a result, IT generalists – who know a little bit about everything, have a holistic understanding of the application stack and can make quick, informed decisions about new technology – will be particularly successful in 2017 and beyond.


More specifically, the introduction of new machine-based technologies alongside the continued adoption of a DevOps culture, which encourages a de-siloed IT department, will require IT professionals to focus on developing new skillsets and certifications to operate and manage next-generation data centers:


·        Rise of the Machines: The integration of new machine-based technologies like bots and artificial intelligence, which aim to automate basic processes and search functions, will require new management and monitoring processes. For organisations that choose to leverage this technology, IT professionals must determine not only which team or administrator will own the deployment and maintenance of the technology, but also which monitoring standards to apply, which security protocols to follow, and so on.


·        DevOps: The adoption of DevOps shows no sign of slowing down in 2017. For the uninitiated, DevOps is a culture and mindset in which development and operations teams collaborate, using intelligence about how an application runs in production to inform and improve how it is built. Expect the DevOps culture to permeate even more IT shops as the benefits of streamlined troubleshooting, faster remediation, and an improved end-user experience are increasingly sought after.


6.      Hybrid IT: Not Just a Concept, but a Reality

According to McKinsey research, businesses plan to reduce the number of workloads hosted in on-premises environments, moving many to dedicated private cloud, virtual private cloud, and public infrastructure-as-a-service (IaaS) providers, all of which are expected to see greater adoption in the coming years. There is no doubt that hybrid IT is the reality for the majority of organisations today and for the foreseeable future; the data center itself is becoming increasingly hybrid. IT professionals must therefore start thinking about management in a hybrid context. But what does hybrid IT actually look like in practice?


In 2017, IT and business leaders will decide on specific solutions as they implement hybrid IT. For example, they may choose to run Office 365 and Skype for Business in the cloud while hosting the identity management solution, Active Directory Federation Services, on-premises. As another example, the cloud has proven to be the best platform for virtual desktop infrastructure (VDI), giving organisations the flexibility and elasticity to provision and de-provision virtual desktops in bulk. By migrating this workload to the cloud, an organisation relieves its IT professionals of the need to manage that infrastructure directly, refocusing their efforts on other on-premises projects.


Over the next several years, IT departments must exercise their growing responsibility to act as a technology liaison to business management by staying informed and making smart decisions about the cloud, even if the decision is to do nothing in the near term because there is no immediate need for change. The key is to build a hybrid IT roadmap that evaluates cloud adoption on a per-workload, per-application basis to achieve a more agile, available, scalable, and efficient data center.

