Edge computing is needed to cope with the amount of data generated

Life on The Edge – The Benefits of Edge Computing

In the first instalment of the “Life on the Edge” blog series we discussed the challenges faced by organisations embracing IoT and similar technologies. These barriers are not insurmountable, and the benefits of overcoming them are considerable. In this second instalment, we outline some of the reasons why your business should be investing in edge computing.

Facilitate true autonomy

Automating production lines has been instrumental in helping to reduce costs and maximise output. But most systems are only capable of operating according to a narrow set of pre-defined rules.

Industry 4.0 demands that automation is not only quicker, but also smarter. By moving data processing capabilities to the edge, it becomes possible to implement real-time autonomy.

Autonomous systems are not only faster; with the application of machine learning, they can also be trained to make “intelligent” decisions.

Improve production quality

By removing humans from the equation, autonomous systems are not only faster, but also more consistent. There is far less potential for human error to affect output.

This is nothing new; consider the automation of production lines, a hallmark of the original Industrial Revolution. What has changed is the introduction of autonomy using computing at the edge.

Consistently high-quality output, increased efficiency and faster decision making will help to reduce costs and increase margins.

Better manage data growth

IoT sensors generate enormous amounts of data, placing additional pressure on your network and storage infrastructure. Processing incoming data at the edge allows you to not only action information in real time, but also to better manage what happens next.

With edge computing you can decide whether data is passed back into back-end systems for further processing and analysis, archived to the cloud or discarded entirely. Managing data at the edge will help to constrain data growth by ensuring information is moved to the most appropriate location.
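To make this triage concrete, the sketch below shows how simple the routing logic at an edge gateway can be. It is a minimal Python illustration; the message fields, categories and destination names are hypothetical assumptions rather than part of any specific platform.

    # Minimal sketch of edge-side data triage (hypothetical fields and
    # destinations): act locally, forward for analysis, or discard.
    from dataclasses import dataclass

    @dataclass
    class SensorMessage:
        sensor_id: str
        kind: str      # e.g. "alarm", "measurement", "status"
        value: float

    def route(msg: SensorMessage) -> str:
        """Decide what happens to a message at the edge."""
        if msg.kind == "alarm":
            return "act-locally"        # needs a real-time response
        if msg.kind == "measurement":
            return "forward-to-cloud"   # retained for back-end analysis
        return "discard"                # routine traffic with no long-term value

    for m in [SensorMessage("pump-7", "alarm", 98.6),
              SensorMessage("pump-7", "measurement", 42.1),
              SensorMessage("pump-7", "status", 1.0)]:
        print(m.sensor_id, m.kind, "->", route(m))

In practice the rules would be far richer, but the principle is the same: the decision is made at the point of collection, before the data ever touches your network backbone.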

Avoid downtime and production bottlenecks

Sensors provide real-time insights into system health, but the data they generate can also be analysed to reveal trends over the longer term. Edge computing provides a mechanism to identify the early warning signs of an impending system failure.

These insights can then be used to proactively schedule maintenance cycles. Rather than delaying repairs and inspections until it is too late, you can carry out maintenance routines in advance, preventing a full system failure and actually improving operations and output.
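As a simple illustration of how such early warnings can be derived, the Python sketch below flags a reading that drifts well outside the recent baseline. The window size and three-sigma threshold are illustrative assumptions, not figures from any particular product.

    # Minimal sketch of trend-based early warning: flag readings that drift
    # well outside the recent baseline (window and threshold are assumptions).
    import random
    from collections import deque
    from statistics import mean, stdev

    WINDOW = 50        # number of recent readings used as the baseline
    THRESHOLD = 3.0    # flag readings more than 3 sigma from the baseline

    history = deque(maxlen=WINDOW)

    def check(reading: float) -> bool:
        """Return True if the reading deviates enough to warrant maintenance."""
        anomalous = False
        if len(history) == WINDOW:
            mu, sigma = mean(history), stdev(history)
            anomalous = sigma > 0 and abs(reading - mu) > THRESHOLD * sigma
        history.append(reading)
        return anomalous

    random.seed(1)
    readings = [10 + random.gauss(0, 0.5) for _ in range(60)] + [25.0]
    for r in readings:
        if check(r):
            print("early-warning flag raised for reading", round(r, 1))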

Realise your data-driven ambitions

Digital transformation is fully reliant on the way that your business handles and actions data. Edge computing provides real-time autonomy at the very point where your products are being produced or customers engaged.

Edge computing will also be instrumental in helping you apply insights generated from other activities. Without the necessary systems at key interfaces, you will be unable to properly action the findings of your big data analytics efforts. With the right edge computing infrastructure in place, these benefits (and more) become available to almost any business.

Useful Links

Seagate optimizes manufacturing with edge computing and AI analytics

Refinery of the future: Texmark Chemicals transforms the way it does business

CenterPoint Energy: Smart energy, fuelled by data

Edge Computing

Life on The Edge – Facing the Challenge of Edge Computing

Businesses face an unusual dilemma as they prepare for a data-driven future, caught between two competing trends. The first is big data analytics: centralising data to ensure as much information as possible is available. The second is automation: applying data to increase efficiency and reduce manual intervention.

This creates a problem, however. In order to function correctly, automation systems need to process and action data at the point of collection, at the edge of the network. This is in complete opposition to the centralised model favoured by data-driven industry.

What does this actually look like?

Take self-driving cars for instance. Each vehicle is equipped with thousands of sensors to navigate routes and avoid collisions. In order to succeed, information must be processed in real time – the vehicle cannot tolerate any latency, ruling out cloud-based systems.

At the same time, vehicle manufacturers need to collect data from onboard sensors to drive product development and safety improvements. And this is where centralised cloud systems do make sense.

Autonomous vehicles are just one example of this dilemma. Factories, retailers, operators and producers all face the same challenge as they try to embrace the data-driven future. Any business deploying smart sensors, IoT devices and predictive analytics will encounter similar issues.

Ever-increasing data volumes

The introduction of IoT devices has exponentially increased the volumes of data being generated. Each sensor can output multiple messages every second. Although small in size, each signal needs to be analysed and actioned immediately.

In most cases, sensor output is nothing more than a ‘status ok’ type message that can be safely actioned at the edge and sent straight to archive storage. In fact, it may be perfectly reasonable to discard such messages entirely, as they offer little long-term value.

Without rules that filter and direct this constant stream of information, businesses will see their data capacity requirements – and costs – escalate even faster than anticipated. The right information must be retained, however; otherwise the results of your predictive analytics efforts will be unbalanced or incomplete.
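One common pattern for taming this stream is ‘report by exception’: a reading is forwarded only when its value changes, plus an occasional heartbeat so that silence can be distinguished from failure. The Python sketch below illustrates the idea; the heartbeat interval is an assumed figure for illustration.

    # Minimal 'report by exception' filter: forward a reading only when its
    # value changes, or when a heartbeat is due (interval is an assumption).
    import time

    HEARTBEAT_SECONDS = 60.0   # assumed interval; tune per deployment

    class ExceptionReporter:
        def __init__(self):
            self.last_value = None
            self.last_sent = float("-inf")

        def should_forward(self, value: float, now: float = None) -> bool:
            now = time.monotonic() if now is None else now
            if value != self.last_value or now - self.last_sent >= HEARTBEAT_SECONDS:
                self.last_value, self.last_sent = value, now
                return True
            return False   # repeat 'status ok' traffic is dropped at the edge

    reporter = ExceptionReporter()
    # Three identical readings one second apart: only the first is forwarded.
    print([reporter.should_forward(1.0, now=t) for t in (0.0, 1.0, 2.0)])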

The fundamental challenges you face

In order to succeed in a data-driven operating environment, your business needs to adapt to computing at the edge. You will need to address:

  • How to provide adequate processing power to deal with incoming data in real time.
  • How to specify storage for machine-generated information.
  • How to provide sufficient network bandwidth between the edge, data centre and cloud.

With the right infrastructure, these challenges can be overcome. And the benefits of edge computing are significant – you can read more in part 2 of this blog series next week.

Useful Links

Dr. Tom Bradicich, HPE, The Intelligent Edge: What it is, what it’s not, and why it’s useful

When OT and IT collide: Managing convergence on the industrial edge

IDC FutureScape: Worldwide IoT 2019 predictions (Analyst report)

2018 National Retail Security Survey (National Retail Federation report)

The merging of cybersecurity and operational technology (ISACA and ISA report)

Mining 24 hours a day with robots (MIT Technology Review)

Managed Service Provider

Top Tips: Choosing a Managed Service Provider

Choosing a Managed Service Provider is one of the most important decisions your business makes. Once you’ve entrusted your technology to an MSP, it will be difficult, time-consuming and costly to move. We have put together these top tips to help you choose the right partner.

The right portfolio and scalability

A quick Google search will surface multiple MSPs, but finding one with the right portfolio to meet your specific business needs, plus the breadth of services to grow with you, can be more difficult. Are you looking for specific skill sets and a technology portfolio that will drive your business forward? WTL is an experienced Solaris and Linux specialist that can drive real transformation in your business. When coupled with a deep understanding of your business and its specific challenges, these are hard-to-find qualities.

SLAs that meet your requirements

When looking for a managed service provider, the SLAs they offer will be critically important. Do they match what you are looking for? What happens if the MSP doesn’t meet the SLAs? Are there any financial incentives? It is important to choose an MSP who puts its money where its mouth is.

When negotiating a contract, is the MSP flexible and willing to consider something that isn’t standard? The level of flexibility that is evident at this stage could be an indicator of how accommodating the MSP will be once you’ve been onboarded.

Expertise, qualifications and accreditations

It might sound obvious, but a quality MSP needs to have the right qualifications and should keep these up to date. Don’t just look for standard accreditations; expect to see deep expertise in the latest technologies: cloud, AI, virtualisation, mobility, security, networks, edge, analytics and more.

Accreditations demonstrate the MSP’s commitment and investment and should indicate that they are taking your technology needs seriously. WTL holds all the most current certifications for Solaris, is a Red Hat Ready partner, an Oracle Gold Partner, an Enterprise Solution Provider for VMware and a Silver Veeam ProPartner, and holds many other leading technology accreditations.

Culture and values

Does the MSP share the culture, ethics and code of practice of your own business? If this is to be a long-term, mutually respectful partnership, your MSP should hold the same values and be committed to helping you achieve your goals.

You need a partner that will evolve with the wider marketplace, utilising the best technology and the best services for your business. Not one that will get comfortable and forget about innovation. WTL partners with the leading vendors and is always seeking out innovation that will drive real business benefits for you.

Meet the team, ask to see the company handbook, do some research on Glassdoor, and find out more about the culture of the partner you are considering.

Best practice policies and procedures

Ensure the MSP utilises industry best practice across the organisation. MSPs that adhere to common frameworks ensure that the right processes, people and systems are in place to help you meet your business objectives. WTL holds the recognised ISO27001 and ISO9001 certifications for information security and quality management.

An MSP should be able to detail its processes and policies to you, providing full visibility and transparency.

Security

As with policies and procedures, cybersecurity concerns will be high on your list. Look for best practice frameworks and inspect security policies and procedures which cover monitoring, detection, incident logs, remediation, risk management, patch installation and incident response processes. Ask the MSP to demonstrate compliance with regulations and to ensure that your data will be stored in accordance with the security requirements that the industry demands (PCI DSS, HIPAA, etc.) and with wider data protection regulations.

WTL is a Cyber Essentials approved partner, demonstrating its commitment to an industry-standard cybersecurity framework and offering customers a high level of systems and data security and governance.

Word of mouth

Ask to speak to at least one or two existing customers who share your business transformation goals and some common demographics, or use your own network to verify the partner’s reputation and reliability. You’re looking for an honest appraisal that you won’t find in a brochure or on a website. There is no greater endorsement than a peer endorsement.

When you have satisfied all of the above, you are ready to choose a managed service provider. If you are looking for an experienced and trusted managed service partner, WTL will provide references on request and can satisfy any due diligence questions you may have.

Useful Links

Intercity Technology – Top 5 Tips for Choosing a Managed Service Provider

IBM: Top 10 tips for selecting a managed service provider

How to choose a Managed Services Provider: A 20 point checklist to choosing the right MSP for your business

To outsource or not to outsource IT

IT: To outsource or not to outsource?

Businesses choose to outsource their IT services for a variety of reasons, including accessing additional skills and expertise not held inhouse. Statista figures from 2018 identified that 46% of businesses across the globe that outsourced IT services did so to plug skills gaps. This is particularly relevant when a business needs specialist skills like Solaris or Linux, which can be hard to recruit and retain. 36% of those surveyed outsourced to save money and 35% wanted to free up resources to focus on their core business. 33% wanted to add scale to their business, 29% wanted to improve flexibility in the use of their resources and 10% did it to encourage and facilitate innovation.

As the statistics above show, outsourcing takes many forms. Some businesses bolster their existing team with additional resources and some outsource their entire IT operation, including technology and people. Depending on the business’ requirements, outsourcing can facilitate 24/7/365 cover, with stringent SLAs and no issues with staff holiday cover, sickness or transience.

Outsourcing can save businesses money in a number of ways. Businesses that outsource their infrastructure gain access to the latest technology and systems without the huge upfront investment. Costs are spread and paid on an OpEx basis rather than CapEx, which can make budgeting and future planning easier. The not-insignificant HR costs associated with an in-house IT team are eliminated; NI, tax, sickness and holiday pay become the responsibility of the outsourcing partner.

Cybersecurity defences are stronger, with many managed service providers running security operations centres that monitor customer networks and systems continuously to predict and protect against threats.

With such compelling reasons to outsource, why do businesses keep their IT services inhouse?

For some businesses, the appeal of an inhouse IT team lies in the ability to get help from someone in person, in the moment, rather than at the end of the phone.

While some organisations appreciate that an inhouse team can take time to focus on projects or solutions, as they aren’t calling off hours from a monthly quota, others are aware that inhouse teams can be overworked and understaffed, constantly dealing with time-pressured “urgent” issues and never freeing up time to innovate or strategise. In addition, a common issue arises when large amounts of critical knowledge are held by a single individual. Particularly prevalent with specialist skills around Solaris, Linux or even applications like Oracle databases, this “key man risk” is a serious flag on an operational risk register, mitigated by delegation, expanding the team, sharing responsibility with a different team, or outsourcing.

As the cost savings outlined above suggest, hiring a team is expensive, especially one that needs to incorporate a diverse range of skills and expertise to cover operating systems such as Windows, Solaris and Linux.

WTL has deep expertise in the world’s leading enterprise technology, including Oracle Solaris and Linux, employing some of the country’s most experienced engineers and architects to ensure that customers can take advantage of the technology without worrying about training or skills shortages. WTL works closely with its customers to understand if and how outsourcing some or all of their IT resources can benefit their business, fitting in with existing teams where necessary. Every business is different, generates different types of data and will require different systems to meet its needs. For some companies, especially those in a growth period, moving towards more complex systems or approaching commercialisation, outsourcing some responsibilities to a board-level individual who can help drive strategies forward is the right solution.

Outsourcing IT to an MSP can be a flexible, smart approach that frees up resources, allowing a business to focus on the important elements of growth.

If you are unsure whether outsourcing your technology operations is right for you, give WTL a call today.

Useful Links

Statista Global reasons to outsource 2019

Top 10 reasons to outsource

Advantages of outsourcing services

10 Benefits of working with a Managed Service Provider

Smart City Connected by Hyperconverged Infrastructure

The evolution of Hyperconverged Infrastructure – NetApp’s role in this expanding market

Enterprises and mid-market organisations alike are starting to realise the transformational benefits of Hyperconverged Infrastructure (HCI), where server, storage and networking resources are provided as a combined, modular block and managed by a single interface.

Analysts are predicting that adoption will continue to rise, and a recent report by the Evaluator Group highlighted that acceptance and implementation of HCI by enterprise-sized firms has increased, with 79% of large enterprises expanding their use of hyperconverged infrastructure and using it for mission-critical workloads.

Traditional data centres are built with each resource layer provisioned separately and often managed individually. Conversely, HCI brings the different resources (server, storage and networking) together in a way that is simple to manage, allocate and consume.

So how else do businesses benefit from hyperconvergence? Many HCI users report improved IT team productivity, more agile business operations and a greater ability to support a hybrid cloud environment and cloud applications.

Businesses also report lower CapEx, as SAN-based storage solutions are replaced by industry-standard servers and overprovisioning is a thing of the past. Resources can be added as and when they are needed to scale out.

OpEx is also reduced, as fewer resources mean less floor space, power and cooling consumption. The simplified and automated nature of HCI administration means that management overheads are lower, increasing staff productivity and allowing IT teams to do more with the same number of staff.

Risks are lowered as downtime is reduced during upgrades and system refreshes, which happen automatically. The supply chain is smaller and that inherently reduces the operational risks associated with vendor management.

Modern HCI solutions need to be able to provide predictable and guaranteed service levels for multiple primary workloads all competing for bandwidth. They must integrate with multiple public clouds, creating a seamless hybrid multi-cloud with a common data fabric for private and public clouds. Whilst the essence of HCI is that all components are provided together, in reality most organisations do not scale evenly: demands for compute, storage and networking grow at different rates. So, modern HCI solutions should be able to scale the individual elements of the solution independently in order to truly maximise resources. A storage-intensive environment may not necessarily need additional compute power.
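To make the independent-scaling point concrete, the toy calculation below sizes additional storage nodes for a growing dataset without touching the compute tier. The per-node capacities are made-up illustrative figures, not vendor specifications.

    # Toy capacity-planning sketch: scale storage independently of compute.
    # Node capacities are illustrative assumptions, not vendor specifications.
    import math

    STORAGE_NODE_TB = 20          # assumed usable TB per storage node
    current_storage_nodes = 4     # 80 TB usable today
    current_compute_nodes = 6

    projected_dataset_tb = 130    # forecast storage demand

    needed = math.ceil(projected_dataset_tb / STORAGE_NODE_TB)
    extra = max(0, needed - current_storage_nodes)

    print(f"Add {extra} storage nodes; compute stays at "
          f"{current_compute_nodes} nodes.")

In a coupled architecture the same growth would force the purchase of whole combined nodes, paying for compute you do not need; independent scaling avoids exactly that overprovisioning.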

NetApp understands the demands of a modern HCI environment and entered the market relatively recently with HCI solutions that were born in the cloud, for the cloud. NetApp HCI offers protection for multiple workloads, allowing organisations to consolidate many different applications on it, safe in the knowledge that those workloads are replicated, protected and available.

NetApp HCI allows organisations to add compute or storage nodes independently, which eliminates overprovisioning and ensures the HCI environment is flexible enough to meet any business’s needs.

NetApp Data Fabric provides consistent data services across on-premises infrastructure and public and private clouds, allowing it to meet the needs of today’s businesses: according to Flexera’s State of the Cloud Report for 2019, 84% of businesses have a multi-cloud strategy and 58% have a hybrid cloud strategy.

In terms of management, NetApp HCI offers an automated deployment engine which has reduced deployment from 400 manual steps to just 30 highly automated ones. Similar automation features in the management console provide integration into higher-level management, orchestration, backup and DR tools.

In short, customers looking at HCI solutions with a view to transforming their business should absolutely consider NetApp. Its HCI has been designed to be future-proof and meets the brief of what a modern HCI solution should offer.

Useful Links

Hyperconverged Infrastructure adoption rates

What is hyperconvergence?

IT Pro Article – Five business benefits of hyperconvergence

IT Pro Article – What is driving the risk of hyperconverged infrastructure?

Hyperconverged.org – Hyperconverged Infrastructure Basics

Flexera State of the Cloud 2019