
Investment in Kenya’s digitalisation and the continued rollout of ICT infrastructure have enabled local enterprises to seriously consider edge computing. Distributed IT architecture, in which data is processed at the edges of the network, offers significant opportunities to businesses, whether that network is limited to a single factory floor or spread across dozens of facilities. A 2023 report from Accenture found that edge computing will accelerate innovation and open new revenue opportunities, with 83% of surveyed C-suite executives worldwide agreeing it will be essential to remaining competitive.

While every organisation may have the ambition to take its infrastructure to the edge, the journey can be hampered by challenges ranging from technical issues to deployment restrictions. Enterprises therefore need to know what they want from their edge infrastructure, understand the challenges standing in their way, and identify solutions for overcoming them. Then there is real potential for this new kind of computing to set the standard for IT operations for years to come.

Cutting edge

The goal of edge deployments is simple: to bring compute capability and data storage closer to where they are needed. Edge computing has the potential to transform entire industries, including healthcare, retail, manufacturing, and critical infrastructure. For example, it is playing a critical role in transforming Africa’s banking sector by improving access to customer services, speeding up data transfer, and facilitating rapid authorisation checks.

Every edge deployment is different, shaped by the organisation’s IT architecture and use case. However, edge deployments can broadly be categorised into three groups:

  • Enterprise edge: Use cases that feature an enterprise data store at their core and extend application services to remote locations. For example, a retail franchise can use the edge to offer new services and improve in-store experiences.
  • Operations edge: Use cases that concern industrial edge devices overseen by operational technology (OT) teams, and that gather, process, and act on data on-site. Using IoT sensors, a factory can analyse conditions on its floor and improve operational efficiency (a minimal sketch of this pattern follows the list).
  • Provider edge: Use cases that involve building out networks and offering services delivered over them, such as a telecommunications provider. Provider edge supports low latency, with computing environments located close to devices and customers.
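
To make the operations edge case concrete, the sketch below shows the pattern in Python: sensor readings are gathered and acted on locally, with no round trip to a central data centre. The sensor read, the threshold, and the response are hypothetical placeholders for illustration, not part of any specific platform.

```python
import random
import time

TEMP_LIMIT_C = 75.0  # hypothetical threshold, chosen only for this illustration

def read_temperature() -> float:
    """Stand-in for a real IoT sensor read; replace with your device's SDK."""
    return 60.0 + random.random() * 20.0

def act_locally(reading: float) -> None:
    """The decision is taken on-site, so the machine can be throttled immediately."""
    if reading > TEMP_LIMIT_C:
        print(f"Overheat ({reading:.1f} C): slowing line locally")
    else:
        print(f"Reading {reading:.1f} C: within limits")

if __name__ == "__main__":
    for _ in range(5):
        act_locally(read_temperature())
        time.sleep(1)  # poll interval; tune to the process being monitored
```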

A major benefit of edge computing, one that may not be immediately apparent, is its usefulness in complying with data sovereignty rules and national regulatory guidelines. By storing and processing data at the source, enterprises can keep it within the relevant borders and obscure any sensitive data before it is sent to the cloud, which could be hosted on the other side of those borders.
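
As a rough illustration of that pattern, the Python sketch below masks sensitive fields at the edge before a record ever leaves the local site. The field names and the use of unsalted SHA-256 digests are assumptions made for brevity, not a compliance recipe.

```python
import hashlib
import json

SENSITIVE_FIELDS = {"national_id", "phone_number"}  # assumed field names for illustration

def mask_record(record: dict) -> dict:
    """Replace sensitive values with one-way digests before any upload to the cloud."""
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            masked[key] = hashlib.sha256(str(value).encode()).hexdigest()
        else:
            masked[key] = value
    return masked

local_record = {"national_id": "12345678", "phone_number": "+254700000000", "branch": "Nairobi"}
# Only the masked copy would be sent beyond the local site.
print(json.dumps(mask_record(local_record), indent=2))
```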

Reaching out to the edge

Going to the edge is, in essence, a move towards decentralisation. An organisation’s infrastructure is no longer restricted to a single geographical location or facility, and is instead spread across multiple sites. In light of this, the biggest challenge enterprises may face is connectivity: latency and bandwidth limitations can hamper deployments. To overcome this, enterprises can use solutions such as edge caching, content delivery networks, and redundancy mechanisms that support offline operation and local data processing.
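
One common way to tolerate a patchy link is a store-and-forward buffer: data is queued locally and flushed when connectivity returns. The sketch below is a minimal, in-memory version of that idea, written against assumed names; a production deployment would persist the queue to disk and use a real transport.

```python
from collections import deque

class StoreAndForward:
    """Queue readings locally; flush to the central system when the link is up."""

    def __init__(self):
        self.buffer = deque()

    def record(self, reading: dict) -> None:
        self.buffer.append(reading)  # always succeeds, even while offline

    def flush(self, link_up: bool, send) -> int:
        """Send buffered readings if the link is available; return the count sent."""
        sent = 0
        while link_up and self.buffer:
            send(self.buffer.popleft())
            sent += 1
        return sent

# Usage: keep recording while offline, then flush once connectivity is restored.
edge_buffer = StoreAndForward()
edge_buffer.record({"sensor": "line-3", "temp_c": 71.2})
edge_buffer.record({"sensor": "line-3", "temp_c": 73.8})
print(edge_buffer.flush(link_up=True, send=lambda r: print("uploading", r)))
```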

Enterprises may also face interoperability issues. With so many diverse hardware and software platforms in play, enterprises need a computing stack that can run consistently across all of them. An open hybrid cloud approach is the solution here, as it provides the interoperability and management tools necessary to operate and scale edge deployments effectively.

On the topic of scalability, enterprises may find a lack of adequate resources constraining their ability to grow their networks. Edge orchestration frameworks that distribute and balance workloads across devices remove that obstacle, while also offloading tasks to more powerful assets and letting devices prioritise important computations.
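
The offloading decision itself can be quite simple. The sketch below is written against assumed names rather than any particular orchestration framework: it keeps high-priority work on the local device while capacity allows and pushes heavier, lower-priority tasks to a more powerful node.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: int      # higher means more urgent
    cpu_cost: float    # rough relative cost of running the task

LOCAL_CPU_BUDGET = 2.0  # assumed spare capacity on the edge device

def place_tasks(tasks: list[Task]) -> tuple[list[Task], list[Task]]:
    """Run urgent work locally while budget allows; offload the rest upstream."""
    local, offloaded = [], []
    budget = LOCAL_CPU_BUDGET
    for task in sorted(tasks, key=lambda t: t.priority, reverse=True):
        if task.cpu_cost <= budget:
            local.append(task)
            budget -= task.cpu_cost
        else:
            offloaded.append(task)
    return local, offloaded

local, offloaded = place_tasks([
    Task("safety-check", priority=10, cpu_cost=0.5),
    Task("video-analytics", priority=5, cpu_cost=3.0),
    Task("report-rollup", priority=1, cpu_cost=1.0),
])
print([t.name for t in local], [t.name for t in offloaded])
```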

A new kind of edge

No discussion of edge computing is complete without mentioning artificial intelligence (AI). Edge computing and AI are a natural pairing, because the goal of edge computing is not just to extend cloud environments to data sources and users, but also to deliver insights where and when they are needed.

Edge AI, the implementation of AI/ML systems in an edge computing environment, accelerates decision-making and delivers real-time feedback because models run on the edge platform itself rather than on a backend cloud system. Edge devices running AI models can act on analytics immediately, while using less power and bandwidth and offering greater privacy and security because data is handled locally.
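
In practice, the value of edge AI comes from short-circuiting the cloud round trip: the model runs next to the data and only a compact result, not the raw stream, goes upstream. The sketch below illustrates that flow with a deliberately trivial stand-in "model"; in a real deployment this would be a quantised ML model served by an on-device runtime.

```python
def tiny_model(vibration_mm_s: float) -> str:
    """Stand-in for a real on-device model; returns a label, not raw data."""
    return "anomaly" if vibration_mm_s > 4.5 else "normal"

def handle_reading(vibration_mm_s: float) -> dict:
    """Infer and act locally; only the summary verdict is forwarded upstream."""
    verdict = tiny_model(vibration_mm_s)
    if verdict == "anomaly":
        print("Local action: flag machine for inspection now")
    # The raw signal never leaves the site, saving bandwidth and limiting exposure.
    return {"verdict": verdict}

for sample in (2.1, 5.3, 3.8):
    print(handle_reading(sample))
```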

Edge computing ecosystems are still maturing, so enterprises in Kenya would do well to be especially diligent when choosing platforms, tools, and other solutions. That diligence begins with considering what edge deployments could do for your business and anticipating the challenges that may lie in wait. From there, it’s up to you to make the leap to the edge.

By Christopher Saul, Territory Sales Lead for East Africa at Red Hat