IT Network

9 Trends That Will Impact Your IT Network

Data-centric operations are changing the way we work – and placing new demands on your IT network. Here are nine new trends you need to be aware of – can your current network cope?

1. Cloud hosted apps

The unbeatable flexibility provided by public cloud platforms makes them ideal for new app deployments. Containerisation and microservices are increasing in popularity because they offer unrivalled portability and resource control – but they also rely on uninterrupted connectivity between network edge, core and cloud data centre to perform adequately.

2. Distributed apps

Interconnected microservices can be hosted anywhere – on-site, at the network edge or in the cloud. Location is determined by performance needs – and again, reliable, speedy connectivity is critical.

3. Continuous development

Agile development and fail-fast methodologies result in continuous delivery of app updates. The development team needs a network infrastructure that allows it to increase the speed of production and delivery, whilst containing operational costs.

4. Virtual becomes serverless

Moving away from the concept of servers (physical or virtual) requires a different approach to infrastructure architecture. According to Cisco, future networks will be built around “nerve clusters”, mini networks located where the data is, with a reliable backbone to connect each cluster as required.

5. IoT goes mainstream

Smart sensors and IoT devices are no longer the preserve of manufacturing or self-driving cars. The ability to capture – and action – real-time data can be used in a broad range of industries. As well as improving connectivity between edge IoT devices and the network core, network administrators will need a more flexible way to manage them. Infrastructure will have to become smarter to allow administrators to identify and classify connected devices and to apply policies that maintain performance without impacting other networked assets.
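The identify-classify-apply-policy pattern described above can be sketched in a few lines of code – a purely illustrative example, in which the device attributes, classes and policy values are all hypothetical:

```python
# Illustrative sketch: classify connected devices from observable attributes,
# then look up a network policy for each class. All names and values are
# hypothetical examples, not a real vendor API.

def classify_device(mac_vendor: str, traffic_profile: str) -> str:
    """Assign a device class from simple, observable attributes."""
    if traffic_profile == "periodic-telemetry":
        return "iot-sensor"
    if mac_vendor in {"Cisco", "Aruba"}:
        return "network-infrastructure"
    return "unclassified"

# Policies keep each class of device from impacting other networked assets.
POLICIES = {
    "iot-sensor": {"vlan": 40, "bandwidth_limit_mbps": 1},
    "network-infrastructure": {"vlan": 10, "bandwidth_limit_mbps": None},
    "unclassified": {"vlan": 999, "bandwidth_limit_mbps": 0.5},  # quarantine
}

def policy_for(mac_vendor: str, traffic_profile: str) -> dict:
    return POLICIES[classify_device(mac_vendor, traffic_profile)]

print(policy_for("Acme", "periodic-telemetry"))  # sensor lands in VLAN 40
```

The point of automating this is scale: with thousands of connected devices, per-device manual policy assignment simply does not keep up.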

6. Here comes AI

Using Artificial Intelligence (AI) to automate and accelerate operations relies on the ability to access and process data quickly. As AI adoption grows, more processing will take place at the network edge. Network infrastructure will have to be capable of delivering information to AI engines in near real time in order to succeed. This will require improvements in connectivity between network edge, core and the cloud depending on where computation is being performed.

7. We’re all mobile now

Cisco once predicted that mobile data traffic would increase at an annual growth rate of 42% – but that was before the 2020 global pandemic shut down offices across the world. That estimate now looks increasingly conservative. Workforces are likely to remain highly distributed and mobile for the foreseeable future – or even permanently. Accessing corporate systems from a range of devices outside the company network decreases visibility and control. Careful thought will have to be given as to how to control access to resources, particularly as IoT devices further increase network complexity and ‘noise’.

8. Cybersecurity must get smarter

As corporate systems extend outside the network perimeter, the attack surface available to hackers increases. Cyberattacks are increasingly sophisticated, so businesses will need to continue investing in network infrastructure that allows them to identify, contain and mitigate threats. These protections will need to be extended to cloud environments too, providing similar defences for data and applications hosted outside the network perimeter.

9. AR and VR are finally happening

Augmented Reality and Virtual Reality technologies have begun to mature, moving from consumer novelty to business productivity tool. New applications include improved collaboration, training and even remote working ‘experiences’. But every productivity gain comes at a cost, increasing demand on your network resources. The future-ready network will need to deliver improved end-to-end throughput with minimal latency. Dynamic performance controls will help to guarantee a decent end-user experience and ensure that other mission-critical activities are not impacted, without overwhelming the network administrator.

The future is more

Clearly all nine of these trends have one thing in common – more network resources. Or more specifically, more efficient, flexible network resources that will support changing workloads and priorities. Without planning for these significant changes soon, businesses may find they are unable to support the applications they need in future.

To learn more about how WTL and Cisco can help you meet these challenges head-on, please get in touch.

Useful Links

Cisco – 2020 Global Networking Trends Report


Life on The Edge – The Benefits of Edge Computing

In the first of the “Living on the Edge” blog series we discussed the challenges faced by organisations embracing IoT and similar technologies. These barriers are not insurmountable – and the benefits of overcoming them are considerable. In this second instalment we outline some of the reasons why your business should be investing in edge computing.

Facilitate true autonomy

Automating production lines has been instrumental in helping to reduce costs and maximise output. But most systems are only capable of operating according to a narrow set of pre-defined rules.

Industry 4.0 demands that automation is not only quicker, but also smarter. By moving data processing capabilities to the edge, it becomes possible to implement real-time autonomy.

Autonomous systems are not only faster, but with the application of machine learning can be trained to make “intelligent” decisions.

Improve production quality

By removing humans from the equation, autonomous systems are not only faster, but also more consistent. There is far less potential for human error to affect output.

This is nothing new; consider automation of production lines which was a hallmark of the original Industrial Revolution. What has changed is the introduction of autonomy using computing at the edge.

Consistently high-quality output, increased efficiency and faster decision making will help to reduce costs and increase margins.

Better manage data growth

IoT sensors generate enormous amounts of data, placing additional pressure on your network and storage infrastructure. Processing incoming data at the edge allows you to not only action information in real time, but also to better manage what happens next.

With edge computing you can decide whether data is passed back into back-end systems for further processing and analysis, archived to the cloud or discarded entirely. Managing data at the edge will help to constrain data growth by ensuring information is moved to the most appropriate location.
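The process-archive-discard decision described above amounts to a simple routing rule applied to each reading at the edge. A minimal sketch – the message fields, thresholds and destination names are hypothetical examples:

```python
# Illustrative sketch: decide at the edge what happens to each sensor
# reading. Fields, thresholds and destinations are hypothetical examples.

def route_reading(reading: dict) -> str:
    """Return the destination for a reading processed at the edge."""
    if reading.get("alert"):
        return "backend"        # needs further processing and analysis
    if reading.get("value", 0) != reading.get("baseline", 0):
        return "cloud-archive"  # worth keeping for trend analysis
    return "discard"            # routine reading, no long-term value

print(route_reading({"alert": True}))                        # backend
print(route_reading({"value": 21.5, "baseline": 20.0}))      # cloud-archive
print(route_reading({"value": 20.0, "baseline": 20.0}))      # discard
```

Even a rule this crude constrains data growth: only the readings that carry information ever leave the edge.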

Avoid downtime and production bottlenecks

Sensors provide real-time insights into system health – but the data they generate can also be analysed to reveal trends over the longer term. Edge computing provides a mechanism to identify the early warning signs of an impending system failure, for instance.

These insights can then be used to proactively schedule maintenance cycles. Rather than delaying repairs and inspections until it is too late, you can carry out maintenance in advance, preventing a full system failure and actually improving operations.
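Early-warning detection of this kind often starts with something as simple as comparing a recent average against a long-run baseline. A minimal sketch, assuming hypothetical vibration readings, window size and threshold:

```python
# Illustrative sketch: flag an impending failure when a sensor's recent
# readings drift above its long-run baseline. Window size and threshold
# are hypothetical examples.

from statistics import mean

def needs_maintenance(readings: list, window: int = 5,
                      threshold: float = 1.2) -> bool:
    """True when the recent average exceeds the baseline by the threshold."""
    if len(readings) < 2 * window:
        return False  # not enough history to judge a trend
    recent = mean(readings[-window:])
    baseline = mean(readings[:-window])
    return recent > baseline * threshold

vibration = [1.0] * 10 + [1.4, 1.5, 1.5, 1.6, 1.6]  # drifting upward
print(needs_maintenance(vibration))  # True: schedule maintenance proactively
```

Running this check at the edge means the maintenance alert fires immediately, without waiting for the readings to make a round trip to the cloud.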

Realise your data-driven ambitions

Digital transformation is fully reliant on the way that your business handles and actions data. Edge computing provides real-time autonomy at the very point where your products are being produced or customers engaged.

Edge computing will also be instrumental in helping you apply insights generated from other activities. Without the necessary systems at key interfaces, you will be unable to properly action the findings of your big data analytics efforts. With the right edge computing infrastructure in place, these benefits (and more) become available to almost any business.

Useful Links

Seagate optimizes manufacturing with edge computing and AI analytics

Refinery of the future: Texmark Chemicals transforms the way it does business

CenterPoint Energy: Smart energy, fuelled by data

Edge Computing

Life on The Edge – Facing the Challenge of Edge Computing

Businesses face an unusual dilemma as they prepare for a data-driven future, pursuing two competing priorities. The first is big data analytics, centralising data to ensure as much information as possible is available. The second is automation, applying data to increase efficiency and reduce manual intervention.

This creates a problem, however. In order to function correctly, automation systems need to process and action data at the point of collection, at the edge of the network. This is in complete opposition to the centralised model favoured by data-driven industry.

What does this actually look like?

Take self-driving cars for instance. Each vehicle is equipped with thousands of sensors to navigate routes and avoid collisions. In order to succeed, information must be processed in real time – the vehicle cannot tolerate any latency, ruling out cloud-based systems.

At the same time, vehicle manufacturers need to collect data from onboard sensors to drive product development and safety improvements. And this is where centralised cloud systems do make sense.

Autonomous vehicles are just one example of this dilemma. Factories, retailers, operators and producers all face the same challenge as they try to embrace the data-driven future. Any business deploying smart sensors, IoT devices and predictive analytics will encounter similar issues.

Ever-increasing data volumes

The introduction of IoT devices has exponentially increased the volumes of data being generated. Each sensor can output multiple messages every second. Although small in size, each signal needs to be analysed and actioned immediately.

In most cases, sensor output is nothing more than ‘status ok’ type messages that can be safely ignored and simply sent to archive storage. In fact, it may be perfectly reasonable to discard them entirely as they offer little long-term value.

Without rules that filter and direct this constant stream of information, businesses will see their data capacity requirements – and costs – escalate even faster than anticipated. The right information must be retained however, otherwise the results of your predictive analytics efforts will be unbalanced or incomplete.
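A filtering rule of the kind described above can be very small indeed. This sketch assumes a hypothetical message format with a `status` field; the idea is simply that routine messages are dropped before they consume storage:

```python
# Illustrative sketch: filter a stream of sensor messages so that only
# meaningful events are retained. Message fields are hypothetical examples.

def filter_stream(messages):
    """Drop routine 'status ok' messages; keep everything else."""
    return [m for m in messages if m.get("status") != "ok"]

stream = [
    {"sensor": "pump-1", "status": "ok"},
    {"sensor": "pump-1", "status": "ok"},
    {"sensor": "pump-2", "status": "overheat", "temp_c": 92},
]
retained = filter_stream(stream)
print(len(retained))  # 1: only the overheat event is kept
```

The hard part in practice is not the filter itself but choosing the rules – as the text notes, discard too aggressively and your predictive analytics will be working from incomplete data.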

The fundamental challenges you face

In order to succeed in a data-driven operating environment, your business needs to adapt to computing at the edge. You will need to address:

  • How to provide adequate processing power to deal with incoming data in real time.
  • How to specify storage for machine-generated information.
  • How to provide sufficient network bandwidth between the edge, data centre and cloud.

With the right infrastructure, these challenges can be overcome. And the benefits of edge computing are significant – you can read more in part 2 of this blog series next week.

Useful Links

Dr. Tom Bradicich, HPE, The Intelligent Edge: What it is, what it’s not, and why it’s useful

When OT and IT collide: Managing convergence on the industrial edge

IDC FutureScape: Worldwide IoT 2019 predictions (Analyst report)

2018 National Retail Security Survey (National Retail Federation report)

The merging of cybersecurity and operational technology (ISACA and ISA report)

Mining 24 hours a day with robots (MIT Technology Review)


Why automation will become the most reliable way of preventing, detecting, and mitigating security threats

Modern organisations are taking advantage of new and innovative technology, transforming their business operations, continuously improving efficiencies, delivering high levels of customer service, and unearthing new opportunities for products and services that wouldn’t have been conceivable 5-10 years ago.

This transformation comes at a price however, and the same technologies used to drive businesses forward are also being deployed maliciously, primarily for financial gain (71% of data breaches were financially motivated, according to Verizon’s 2019 Data Breach Investigations Report) or to gain a strategic advantage. Businesses face greater numbers of security-related events more frequently and in different guises than they did five years ago, with attacks on individuals using social channels becoming more prevalent.

Alongside this, workforces are hypermobile, well used to downloading applications and accessing, storing and transmitting corporate data anywhere and on any device. In order to keep this edge data secure, businesses must now do more than simply protect against attacks, they must try and prevent them from happening in the first place, wherever the user happens to be and whatever device they happen to be using.

So how do you do that?

The first step is to identify genuine threats among the vast swathes of security incident data collected for analysis from a myriad of different sources. Genuine threats are deliberately difficult to spot, and attackers will use next-generation technologies such as AI to hide amongst legitimate traffic. However, some AI and machine learning driven security solutions can analyse massive amounts of data from across any number of data sources, using the power of the cloud to process the analysis right across the organisation, from the edge to the core.
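To make the idea concrete, here is a deliberately simplified, rule-based sketch of scoring events against a behavioural baseline. Real ML-driven products work very differently and at vastly greater scale; every field, user and threshold below is a hypothetical example:

```python
# Illustrative sketch: score login events against a simple behavioural
# baseline to surface anomalies. A stand-in for the ML-driven analysis
# described above; all fields and thresholds are hypothetical examples.

USUAL_COUNTRIES = {"alice": {"GB"}, "bob": {"GB", "US"}}

def anomaly_score(event: dict) -> int:
    """Higher scores indicate more suspicious activity."""
    score = 0
    if event["country"] not in USUAL_COUNTRIES.get(event["user"], set()):
        score += 2  # login from an unusual location
    if event["hour"] < 6 or event["hour"] > 22:
        score += 1  # login outside normal working hours
    return score

event = {"user": "alice", "country": "RU", "hour": 3}
print(anomaly_score(event))  # 3: flag for investigation
```

What the ML-driven approach adds is that the baseline itself is learned from data rather than hand-written, so it adapts as legitimate behaviour changes.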

Oracle is one such security solution, enabling businesses to secure modern hybrid clouds with a set of security and management cloud solutions, which draw on data gathered from logs, security events, external threat feeds, database transactions and applications. It uses AI and machine learning technology to detect malicious intentions, then automates the process of finding available security patches and applying them, and all without downtime.

In addition, Oracle’s automated services encrypt production data and enforce user controls, so you don’t have to do it manually.

As we’ve mentioned, to protect data from edge to core, organisations must implement a multi-layered strategy – and when using the cloud, don’t assume that all data protection responsibility lies with the cloud provider. Most cloud providers operate a shared responsibility model, where they offer assurances around the security of the data held on their infrastructure, but access to that data and SaaS data is usually the responsibility of the customer.

Consider layering your security solutions to protect every layer of data and each access point, including a Cloud Access Security Broker and an Identity Access Manager, which will monitor activity, detect threats, automate the identity management process, send alerts if anomalous behaviour occurs and remediate wherever possible, without the need for human intervention. Making this work across heterogeneous technology on different platforms – on-premises, in the public cloud and in private clouds – is the trickiest part, but Oracle has got it absolutely spot on.

Consider the manual alternative: thousands or even millions of security alerts coming into different management consoles to be sifted through, users to be authorised, behaviour to be monitored and analysed, patches sought and applied, and data to be encrypted. It doesn’t bear thinking about.

WTL offer a range of cybersecurity solutions which employ next-generation features to ensure you remain one step ahead of the cybercriminals.

Useful Links

Verizon’s 2019 Data Breach Investigations Report

Oracle Cloud Essentials – Secure and Manage Hybrid Clouds

Oracle’s Top 10 Cloud Predictions 2019