An Introduction to Edge Computing in the Era of Connected Devices

Bringing Data Processing Closer to Sources

November 18, 2024

In an era of explosive data growth, edge computing has emerged as a game-changing approach to handling real-time information efficiently. But what exactly is edge computing, and why is it becoming increasingly crucial in our data-driven world?


The Data Explosion

To understand the importance of edge computing, we must first grasp the scale of data generation in our digital age. In 2010, Eric Schmidt, former Google Chairman, noted:

"There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created every two days."

Fast forward to today, and the numbers are staggering. Recent reports estimate that roughly 402.74 million terabytes of data are created worldwide each day. This exponential growth poses a significant challenge: how do we handle such massive volumes of data efficiently and in real time?

In traditional approaches such as cloud computing, data must be sent to a central cloud for analysis, which introduces delay, or latency, into the system. Edge computing, a distributed computing model, was introduced to address this.

What is Edge Computing?

Edge computing offers a solution by bringing data processing closer to the source. Unlike traditional cloud computing, which relies on distant data centers, edge computing processes data near where it's created—on local devices or nearby edge servers. This approach significantly reduces latency and speeds up response times, as data doesn't need to travel long distances for processing.

Key Benefits of Edge Computing:

  • Reduced latency
  • Improved response times
  • Decreased bandwidth usage
  • Enhanced data security and privacy

Edge computing is particularly valuable for applications requiring quick decisions, such as smart devices, autonomous vehicles, and industrial IoT systems.

Understanding Edge Computing Architecture

The Core Components of Edge Computing Architecture:

  • Edge Devices (IoT Devices)

    These are the primary data-generating IoT devices at the "edge" of the network. They include sensors, cameras, smart appliances, wearable devices, and other IoT-enabled gadgets. IoT edge devices collect real-time data from the environment and, in some cases, perform basic data processing locally.

    Example: In a smart home, thermostats and lighting systems are edge devices collecting temperature and occupancy data.

    Example: In a smart factory IoT application, industrial machines generating telemetry data are edge devices in the edge computing network.

  • Edge Gateways

    Edge gateways are intermediate devices between IoT edge devices and edge servers or the edge cloud. They act as local computing hubs that aggregate and preprocess data from multiple IoT edge devices. Edge gateways typically handle tasks like data filtering, compression, aggregation, and sometimes real-time analytics.

    Example: In a smart city setup, a gateway might collect data from traffic cameras and sensors at multiple intersections before forwarding relevant edge data for further processing.

  • Edge Servers (or Edge Nodes)

    Edge nodes are more powerful computing centers or server-grade devices located closer to the end-user or device, such as at a 5G small cell tower, factory floor, or building. These edge servers handle more intensive processing and analytics.

    Example: In manufacturing, an edge server helps analyze data from machines, performs predictive maintenance, and triggers actions if a problem is detected, such as adjusting machine settings or sending alerts.

  • Localized Data Storage

Localized data storage is local storage on edge nodes or gateways where processed or filtered data is held temporarily. This allows edge devices to keep important data available for quick access without needing to send all information to the edge cloud.

    Example: In retail, local storage on edge devices can temporarily store video footage from security cameras, enabling faster access for local data analytics before sending selected clips to the edge cloud.

  • Edge Orchestration and Management Platforms

    This component manages and orchestrates the edge infrastructure, providing centralized control over edge devices, edge gateways, and edge servers, and allowing IT teams to monitor, deploy, update, and manage resources efficiently.

  • Network Layer

    The network layer in the IoT architecture connects the edge devices, gateways, and IoT edge servers with the central edge cloud infrastructure, using 5G, Wi-Fi, Ethernet, Bluetooth, LoRaWAN, or other technologies.

    Example: 5G networks are crucial for low-latency communication in applications like autonomous vehicles.

  • Security Layer

    The security layer ensures secure communication, data integrity, and protection against cyberattacks. Security protocols must include data encryption, authentication, secure access control, and intrusion detection.

  • Application Layer

    The application layer consists of the actual applications running on edge devices, gateways, and servers. These applications are customized to process specific types of data, execute business logic, and take actions based on data analysis.
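
The roles of these components can be sketched in a few lines of Python. This is an illustrative data-flow sketch, not a real product API; all function names here are hypothetical.

```python
# Hypothetical sketch of the edge data flow: device -> gateway -> edge server.

def device_reading(sensor_id: str, value: float) -> dict:
    """An edge device produces one raw telemetry reading."""
    return {"sensor": sensor_id, "value": value}

def gateway_preprocess(readings: list, min_value: float) -> list:
    """An edge gateway filters out irrelevant or faulty readings
    before forwarding anything upstream."""
    return [r for r in readings if r["value"] >= min_value]

def edge_server_analyze(readings: list, threshold: float) -> list:
    """An edge server runs heavier analytics and raises alerts locally."""
    return [r["sensor"] for r in readings if r["value"] > threshold]

raw = [device_reading("temp-1", 21.5), device_reading("temp-2", 95.0),
       device_reading("temp-3", -1.0)]            # -1.0: a faulty reading
filtered = gateway_preprocess(raw, min_value=0)   # gateway drops bad data
alerts = edge_server_analyze(filtered, threshold=90)
print(alerts)  # ['temp-2']
```

The point of the sketch is the division of labor: the gateway discards data that never needs to leave the site, and only the edge server's conclusions would travel further.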

How Does Edge Computing Work?

Edge computing has a distributed computing architecture that moves data processing closer to the data source—typically near IoT devices, sensors, or end-user applications—instead of relying solely on centralized cloud servers. This approach allows data to be processed in real-time or near real-time, reducing latency, improving efficiency, and reducing the need to transmit large volumes of data to the cloud.

Let's walk through how an edge computing network works, step by step.

1. Data Generation at the Edge

In an edge computing architecture, the system consists of IoT devices, sensors, cameras, actuators, and other connected edge devices that collect data. These edge devices can be deployed in various environments or edge computing applications like industrial machines, autonomous vehicles, smart homes, or cities.

The edge devices generate different types of data, including real-time sensor readings, video feeds, machine telemetry, and user interaction data. The volume of data can be massive, especially in cases like connected factories, smart cities, or healthcare monitoring systems.

2. Local Data Processing at the Edge

Edge nodes filter and preprocess raw data to remove irrelevant or redundant information close to where it is generated. This step reduces the data load and optimizes bandwidth usage.

The edge node performs tasks such as data normalization, transformation, and compression. For example, a smart camera in a retail store might use object detection models to identify the number of customers, instead of streaming full video feeds to the data cloud.

Edge computing devices can also execute AI/ML models locally. These models might be trained in the data cloud but are deployed on edge devices for fast, real-time decision-making IoT applications.
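
The preprocessing tasks mentioned above can be illustrated with standard-library Python. This is a hedged sketch, assuming min-max normalization and zlib compression; it is not tied to any specific edge framework.

```python
# Illustrative edge preprocessing: normalize readings, then compress the
# payload so far less data crosses the network than the raw stream would.
import json
import zlib

def normalize(values: list) -> list:
    """Scale readings into the 0-1 range (min-max normalization)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def compress_payload(readings: list) -> bytes:
    """Serialize and compress readings before transmission."""
    return zlib.compress(json.dumps(readings).encode("utf-8"))

raw = [18.2, 19.5, 74.0, 21.1]
payload = compress_payload(normalize(raw))
print(len(payload), "bytes sent instead of the full raw stream")
```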

3. Real-Time Analytics and Decision Making

Once the data is processed at the edge, decisions can be made immediately without waiting for instructions from a centralized server or cloud. This is critical for IoT edge computing applications like:

  • Autonomous vehicles: Edge data from LIDAR, cameras, and sensors is processed locally to make split-second decisions like braking, accelerating, or steering.
  • Smart manufacturing: Sensor data from equipment is processed in real-time to detect anomalies and predict failures, allowing for preventative action before costly downtime occurs. This is one of the advantages of edge computing in IoT applications.

Based on the edge computed data, edge computing systems can trigger immediate actions in various IoT edge computing environments such as:

  • Adjusting machine settings in real-time based on sensor data.
  • Sending alerts to operators/service providers when a threshold is crossed (e.g., detecting overheating in industrial equipment).
  • Controlling actuators in real-time, such as turning on a cooling system or adjusting a robotic arm.
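
The threshold-triggered actions listed above amount to a small rule engine running on the edge node. The following is a minimal sketch; the rule table and action names are hypothetical.

```python
# Hypothetical edge rule engine: check each reading against a threshold
# and return the local action to trigger, with no round trip to the cloud.

RULES = {
    "temperature": {"max": 85.0, "action": "start_cooling"},
    "vibration":   {"max": 4.0,  "action": "send_alert"},
}

def decide(metric, value):
    """Return an action name if the threshold is crossed, else None."""
    rule = RULES.get(metric)
    if rule and value > rule["max"]:
        return rule["action"]
    return None

print(decide("temperature", 92.3))  # start_cooling
print(decide("vibration", 1.2))     # None
```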

4. Communication Between Edge and Cloud

Selective Data Transmission

Not all edge data needs to be sent to the cloud. An edge computing network allows for selective data transmission, where only critical insights, summaries, or processed data are transmitted to the cloud for further analysis or long-term storage, optimizing storage and transfer costs.

Data Filtering and Aggregation

At the edge, data is filtered and aggregated for efficiency. For example, rather than sending continuous sensor readings, an edge node may send a summary of readings, such as average temperatures over time.
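
The averaging example above can be sketched directly: one small summary record replaces a whole window of raw readings. The summary fields chosen here are illustrative.

```python
# Illustrative windowed aggregation at the edge: instead of forwarding every
# reading, the node sends one compact summary per time window.
from statistics import mean

def summarize(window: list) -> dict:
    """Collapse a window of raw sensor readings into one summary record."""
    return {"count": len(window), "avg": round(mean(window), 2),
            "min": min(window), "max": max(window)}

readings = [21.0, 21.4, 21.2, 22.0, 21.8]
print(summarize(readings))  # one small record replaces five readings
```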

Offloading Complex Tasks

More computationally intensive tasks, such as machine learning model training or deep data analytics, are offloaded to the cloud because edge devices generally have less processing power and storage than centralized cloud data centers.

The edge cloud still plays a crucial role in edge computing systems by offering:

  • Centralized Control: The edge cloud can act as the orchestrator, updating and managing edge devices, deploying machine learning models, or pushing software updates to the edge devices.
  • Data Analytics: While edge nodes process data in real-time, the edge cloud can perform large-scale, longer-term analytics, providing more complex insights and aggregating data from multiple edge nodes.

5. Edge-to-Edge Communication and Distributed Processing

Distributed Edge Computing

In larger distributed edge systems, multiple edge nodes may need to communicate with each other without involving the edge cloud. This distributed processing model enables horizontal scaling, where multiple edge nodes share and collaborate on data processing tasks.

Distributed edge nodes could be local servers, gateways, or edge devices. Instead of sending all the data to a single edge device or cloud, distributed edge nodes process the data independently at different points in the network. Each distributed edge node manages its local data, reducing the need to transmit large volumes of data to the cloud.

This also increases fault tolerance: if one edge node fails, other nodes can take over the processing workload, ensuring uninterrupted service. It improves data privacy and security as well, since sensitive information can be processed and stored locally at the edge.

Peer-to-Peer (P2P) Communication

Peer-to-Peer (P2P) Communication refers to a decentralized network model where edge devices (also called peers) communicate directly with each other without needing a central server to manage data exchange. Instead of routing data through a central server (like the cloud), edge devices or nodes communicate with each other directly. This allows them to exchange data or coordinate tasks more efficiently, without introducing the delays associated with cloud-based communication.

For example, in a smart city environment, multiple edge nodes responsible for traffic lights can communicate and coordinate decisions to optimize traffic flow based on real-time data from other intersections.

Collaborative Processing

Distributed edge nodes may share computational tasks for load balancing and redundancy. If one edge node is overwhelmed or fails, other nodes take over the processing load, providing fault tolerance.
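
The load-balancing and failover behavior described above can be sketched as a simple node-selection routine. The node list and health flags here are hypothetical; real deployments would use an orchestration platform.

```python
# Illustrative failover: route a task to the healthy node with the lowest
# load, so work continues even when one node is down.

def pick_node(nodes: list):
    """Choose the healthy node with the lowest load, or None if all are down."""
    healthy = [n for n in nodes if n["healthy"]]
    if not healthy:
        return None
    return min(healthy, key=lambda n: n["load"])["name"]

nodes = [{"name": "edge-a", "healthy": False, "load": 0.1},
         {"name": "edge-b", "healthy": True,  "load": 0.7},
         {"name": "edge-c", "healthy": True,  "load": 0.3}]
print(pick_node(nodes))  # edge-c takes over because edge-a is down
```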

Advantages of Edge Computing

Ultra-Low Latency

Edge computing technology reduces the time it takes for data to be processed because processing happens close to the data source, eliminating the need for data to travel to distant cloud servers and back. It also improves the efficiency of day-to-day operations by reducing network downtime and performance issues.

Bandwidth Efficiency

Edge computing reduces the amount of raw data that needs to be transmitted to the cloud. Instead, only the essential, processed data or insights are sent. By limiting the data traffic between the edge and the cloud, and filtering and preprocessing data at the edge, businesses reduce the need for high-capacity network infrastructure, leading to substantial savings in network costs.

Enhanced Performance

Edge computing enables distributed computing, where multiple edge devices work together to share computational loads. This improves overall system performance, as tasks can be offloaded between nodes based on capacity and need.

Improved Data Privacy and Security

By processing data locally, edge computing reduces the amount of sensitive information sent over the network, minimizing exposure to cyber threats during data transmission. This is especially relevant for industries handling sensitive data, such as healthcare or finance.

Enhanced Reliability and Resilience

Since edge devices can function independently of cloud connectivity, they offer greater reliability in situations where internet access is limited or unstable. If cloud connectivity is lost, edge devices can continue operating and making decisions locally.
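
This offline resilience is often implemented as a store-and-forward buffer: readings accumulate locally while connectivity is down and are flushed when it returns. The following is a minimal sketch of that pattern, with a list standing in for the remote cloud.

```python
# Illustrative store-and-forward: buffer readings while offline, then sync
# the backlog to the cloud once connectivity is restored.
from collections import deque

class StoreAndForward:
    def __init__(self):
        self.buffer = deque()
        self.sent = []                # stands in for the remote cloud

    def submit(self, reading: dict, online: bool):
        self.buffer.append(reading)
        if online:
            self.flush()

    def flush(self):
        while self.buffer:
            self.sent.append(self.buffer.popleft())

sf = StoreAndForward()
sf.submit({"t": 20.1}, online=False)   # connectivity lost: buffered locally
sf.submit({"t": 20.3}, online=False)
sf.submit({"t": 20.5}, online=True)    # back online: backlog flushed
print(len(sf.sent))  # 3
```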

Different Types of Computing Technologies used in IoT Environment

At its core, edge computing is defined as a distributed computing framework that brings data processing and storage closer to the physical location, or "edge," where data is generated. This approach minimizes latency, optimizes bandwidth usage, and improves data security by processing data locally at the network edge. Earlier computing models, such as cloud and fog computing, take different approaches.

Cloud Computing

Cloud computing emerged as a widely adopted technology in the mid-2000s, with the launch of services like Amazon Web Services (AWS) in 2006, marking the beginning of mainstream cloud adoption. Cloud computing works by collecting data from IoT devices and transmitting it to centralized cloud servers for processing, storage, and analysis. The cloud provides the scalability and computational power to handle large volumes of IoT data, enabling remote monitoring, advanced analytics, and long-term data storage. However, this approach can introduce latency and bandwidth issues, especially when real-time responses are required.

Fog Computing

Introduced by Cisco in 2012, fog computing emerged as an intermediary solution between cloud and edge computing, aiming to bring processing closer to data sources to reduce latency and bandwidth usage. In IoT environments, fog computing acts as an intermediary layer between IoT devices and the cloud, processing and analyzing data closer to where it is generated. Instead of sending all data directly to the cloud for analysis, fog computing distributes processing tasks to fog nodes located near the IoT devices.

Comparison Between Edge Computing, Fog Computing and Cloud Computing

  • Data Processing Location
    Edge: Directly at or near the data source (e.g., IoT devices, sensors).
    Fog: Intermediate layer between edge and cloud (local nodes/gateways).
    Cloud: Centralized remote data centers.

  • Latency
    Edge: 1-10 ms; local processing enables near-instant responses (e.g., autonomous vehicles, industrial automation).
    Fog: 10-100 ms; data passes through fog nodes before the cloud; ideal for semi-real-time tasks (e.g., smart city traffic management).
    Cloud: 100 ms to 2 seconds; transmission to distant servers introduces higher latency (e.g., streaming, long-term analytics).

  • Bandwidth Usage
    Edge: Low; typically <10 MB/s sent to the cloud, as most processing is local and only processed data is transmitted.
    Fog: Moderate; around 50 MB/s. Fog nodes preprocess and filter data, but some data still reaches the cloud.
    Cloud: High; 100 MB/s to 1 GB/s of raw data, especially for video and high-frequency sensor readings.

  • Scalability
    Edge: Limited by local resources (device or edge server); can scale by adding more edge nodes.
    Fog: Scalable via distributed fog nodes, but still limited by individual node capacity.
    Cloud: Highly scalable; can handle millions of devices and petabytes of data.

  • Network Dependency
    Edge: Low; can operate independently of the cloud and internet (useful for remote areas or critical systems).
    Fog: Moderate; fog nodes handle local processing but still need some cloud connectivity.
    Cloud: High; requires reliable, high-speed internet for transmission to cloud data centers.

  • Processing Power
    Edge: Moderate; devices like NVIDIA Jetson or Raspberry Pi (1-5 TOPS for AI tasks).
    Fog: High; fog nodes with Intel Xeon or similar processors (10-100 TOPS for AI tasks).
    Cloud: Very high; services like AWS or Google Cloud offer TFLOPS to PFLOPS of processing power.

  • Real-Time Processing
    Edge: 1-10 ms for local, immediate decisions (e.g., machine control in industrial settings, autonomous driving).
    Fog: 10-100 ms for quick decisions in semi-real-time tasks (e.g., vehicle-to-everything communication).
    Cloud: 100 ms to seconds; unsuitable for real-time tasks but effective for long-term processing (e.g., batch analytics).

  • Data Privacy and Security
    Edge: High; local processing reduces the risk of exposure during transmission. Ideal for healthcare or financial data.
    Fog: Moderate; some data is processed locally, but critical data may still be sent to the cloud.
    Cloud: Lower; transmission over the internet increases exposure to security risks (e.g., healthcare records, financial transactions).

  • Data Storage
    Edge: Local storage, typically gigabytes on edge devices (e.g., 128 GB SD cards or 1 TB SSDs in gateways).
    Fog: Moderate; typically 1-10 TB per fog node (e.g., micro data centers or localized servers).
    Cloud: Massive; petabytes to exabytes (e.g., Amazon S3, Google Cloud Storage).

  • Cost Efficiency
    Edge: High for localized processing, reducing cloud storage and transmission costs; edge devices are relatively inexpensive ($50-$500 per device).
    Fog: Moderate; hardware costs (fog nodes at $1,000-$10,000) plus cloud integration.
    Cloud: High cost from data transmission, storage, and computing; pay-per-use pricing can escalate with heavy usage.

  • AI/ML Inference Time
    Edge: <10 ms per inference for small models (e.g., facial recognition on edge devices).
    Fog: 10-50 ms; fog nodes can run more powerful models locally (e.g., predictive maintenance in industrial environments).
    Cloud: 50 ms to several seconds; ideal for training large models but adds latency for real-time inference.

  • Resilience and Fault Tolerance
    Edge: High; if one edge device fails, others continue operating independently (e.g., in a smart grid or factory).
    Fog: Moderate; distributed fog nodes provide some redundancy but rely on cloud connectivity for coordination.
    Cloud: Low; cloud-based systems fail when connectivity is lost, affecting all connected devices.

  • Energy Consumption
    Edge: Low; most edge devices consume <1 mW to 50 W, making them efficient for remote or battery-operated systems.
    Fog: Moderate; fog nodes consume roughly 50-200 W, with more powerful processors and larger storage.
    Cloud: High; data centers consume megawatts, with large energy costs for cooling and processing.

  • Management Complexity
    Edge: High; orchestrating thousands of edge devices and keeping them updated requires advanced tooling.
    Fog: Moderate; fewer nodes than edge devices, but managing distributed nodes is still complex.
    Cloud: Low; centralized platforms handle updates, scaling, and orchestration automatically.

  • Example Use Cases
    Edge: Autonomous vehicles, smart grids, industrial automation, real-time health monitoring.
    Fog: Smart cities, vehicle-to-everything (V2X) communication, agriculture, distributed industrial IoT.
    Cloud: Big data analytics, machine learning training, enterprise IT, customer data storage.

  • Representative Technologies
    Edge: NVIDIA Jetson, Google Coral, Intel Movidius for AI inference at the edge.
    Fog: Cisco fog nodes, HPE Edgeline, Dell EMC PowerEdge for fog deployments.
    Cloud: Amazon Web Services (AWS), Google Cloud, Microsoft Azure for cloud-based processing and storage.

Popular Types of Edge Computing in Modern Networks

Multi-Access Edge Computing (MEC)

Multi-Access Edge Computing (MEC), also known as Mobile Edge Computing, refers to the integration of edge computing capabilities into telecommunication networks. MEC was introduced around 2014-2015 by the European Telecommunications Standards Institute (ETSI), primarily to provide cloud computing capabilities at the edge of mobile networks and enable low-latency applications. It allows data to be processed at or near cellular base stations, reducing latency and improving network efficiency.

MEC is integrated into mobile networks (particularly 4G and 5G) to deliver ultra-low latency for services such as video streaming, augmented reality (AR), and virtual reality (VR) in mobile devices and applications. MEC is also used in V2X communication, smart cities for traffic management, smart industries for predictive maintenance, and more advanced applications like connected healthcare or remote surgeries. MEC is tightly integrated with 5G NR and other mobile network technologies, leveraging their high-speed, low-latency characteristics to enhance IoT performance.

IoT Edge Computing

IoT Edge Computing focuses on processing data generated by IoT devices (sensors, cameras, actuators) directly at or near the device itself, rather than sending all data to the cloud. This enables real-time analytics, decision-making, and local actions for IoT-driven applications, reducing latency and improving efficiency. Faster processing and real-time decision-making are crucial for applications like smart manufacturing, healthcare monitoring, autonomous vehicles, and drones.

Communication Protocols in IoT Edge Computing

IoT devices in edge computing communicate using various protocols designed for fast, reliable, and secure data exchange. Common protocols include:

  • MQTT (Message Queuing Telemetry Transport) - A lightweight protocol for transmitting small data packets efficiently.
  • CoAP (Constrained Application Protocol) - Ideal for low-power, low-bandwidth devices.
  • HTTP/HTTPS - Often used for web-based communication.
  • Zigbee and Z-Wave - Common for short-range, low-power wireless IoT networks.
  • Bluetooth Low Energy (BLE) - Used for nearby device communication.

These protocols enable IoT devices to efficiently exchange data in edge environments.
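
MQTT, the first protocol listed above, routes messages through hierarchical topics with two wildcards: `+` matches exactly one level and `#` matches all remaining levels. The matcher below is an illustrative sketch of that rule, not a broker implementation, and it ignores some spec edge cases (such as `a/#` matching `a` itself).

```python
# Illustrative MQTT-style topic filter matching: '+' matches one level,
# '#' matches everything from that level onward.

def topic_matches(filter_: str, topic: str) -> bool:
    f_parts, t_parts = filter_.split("/"), topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True                     # '#' swallows the rest
        if i >= len(t_parts) or (f != "+" and f != t_parts[i]):
            return False                    # level missing or mismatched
    return len(f_parts) == len(t_parts)     # no trailing topic levels left

print(topic_matches("factory/+/temperature", "factory/line1/temperature"))  # True
print(topic_matches("factory/#", "factory/line2/vibration/x"))              # True
print(topic_matches("factory/+/temperature", "factory/line1/humidity"))     # False
```

A subscriber on `factory/+/temperature` would therefore receive temperature readings from every production line with a single subscription.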

Closing Notes

Edge computing is a game-changer for businesses managing massive data from IoT and real-time systems. By processing data locally, it reduces latency and bandwidth usage, enhances privacy and security, and enables real-time decision-making. For businesses, this translates to significant cost savings, improved operational efficiency, and the ability to scale without overloading cloud infrastructure.

With real-time insights and faster responses, edge computing opens up new opportunities for innovation and revenue generation. In today’s data-driven world, adopting edge computing is essential for businesses to remain competitive and agile.


Go Beyond and Explore


What is edge computing, and how does it differ from cloud computing?

Edge computing is a distributed computing model where data processing occurs closer to the data source, often on or near IoT devices themselves. Unlike traditional cloud architectures, which centralize data processing in remote data centers, edge computing minimizes latency by executing computations at the "edge" of the network, close to data sources. This setup is particularly advantageous for applications that require real-time processing and low-latency responses, such as autonomous vehicles, industrial automation, and video analytics. Edge computing offloads bandwidth-intensive tasks from cloud servers, reducing latency and enabling real-time data processing.

Can edge computing operate without a constant internet connection?

Yes, edge computing can operate independently of a constant internet connection, which is one of its core advantages. By processing and analyzing data locally on edge devices, it can continue to provide real-time insights and actions even in environments with intermittent or limited connectivity. This local autonomy is critical for use cases like remote oil rigs, autonomous vehicles, or field sensor networks in agriculture, where reliable connectivity may not be available. Data can be stored and processed locally and later synced with central systems once connectivity is restored, ensuring seamless operations.

What role does edge computing play in IoT?

Edge computing plays a crucial role in IoT by enabling faster, localized processing and decision-making on IoT devices themselves or on nearby gateways, thereby reducing the need to send all data to centralized cloud servers. This is essential in IoT ecosystems where latency, bandwidth, and data security are significant considerations. Through edge computing, IoT devices can process sensor data, trigger alerts, and even perform complex analytics independently. Integration with IoT devices often occurs via edge gateways, which aggregate and preprocess data from multiple sensors, allowing efficient data management and reduced latency in time-critical applications.

What are the limitations of edge computing?

Edge computing has limitations, particularly concerning resource constraints on edge devices, which typically have limited storage, processing power, and memory compared to centralized data centers. In high-volume data applications, such as video processing or high-frequency industrial sensors, these resource limitations can restrict the scope and scale of data processing feasible on edge devices. Additionally, edge devices can present challenges related to security management, maintenance, and scalability, especially as the number of deployed devices grows. Effective edge architecture often requires a balance between processing at the edge and offloading certain tasks to cloud resources to handle more data-intensive operations.

How does edge computing affect data security?

Edge computing can enhance data security by limiting the amount of data that needs to be transmitted over networks to centralized cloud servers, thereby reducing exposure to potential threats during data transfer. By processing sensitive data locally, edge computing helps maintain data privacy, especially in industries with stringent compliance requirements, such as healthcare and finance. However, edge devices are often more vulnerable to physical security risks, unauthorized access, and cyberattacks due to their distributed nature and remote locations. To ensure robust security, edge implementations require comprehensive encryption protocols, hardware-based security, and secure communication channels between edge devices and central servers.

What is an edge gateway in IoT architectures?

An edge gateway in IoT architectures is a specialized device that connects edge devices and sensors with cloud or central systems, serving as an intermediary for data aggregation, preprocessing, and protocol translation. These gateways facilitate efficient data transmission by filtering, compressing, and organizing raw data from multiple sensors before forwarding only the relevant information to cloud platforms. Edge gateways are equipped with networking capabilities to support various IoT protocols (such as MQTT, HTTP, and CoAP) and may also include local storage and processing power for limited analytics. By acting as a central hub, edge gateways improve network efficiency, reduce latency, and enhance overall system resilience.


Authors

Drishya Manohar

Sr. Associate - Content Marketing
Cavli Wireless

