
Edge Computing

Edge computing is a distributed computing paradigm that brings computation and data storage closer to where data is generated and used, i.e., the edge of the network. In edge computing, data processing and analysis are performed locally on devices such as sensors, gateways, or edge servers, rather than relying solely on centralized data centers or cloud computing resources.


Here's how edge computing works:


Data Processing at the Edge:

Instead of sending all data to a centralized cloud server for processing, edge devices process data locally in real-time or near real-time. This reduces latency and bandwidth usage by minimizing the need to transmit large amounts of data over the network.
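As a rough illustration, the Python sketch below shows what local processing on a single edge device might look like: raw sensor samples are checked and aggregated on the device, and only alerts and periodic summaries are sent upstream. The read_sensor and send_to_cloud functions are hypothetical stand-ins for a real sensor driver and uplink, and the window size and threshold are arbitrary values chosen for the example.

```python
import random
import statistics
import time

WINDOW_SIZE = 60          # samples aggregated locally before any upload (assumed)
ALERT_THRESHOLD = 80.0    # readings above this trigger an immediate alert (assumed)


def read_sensor() -> float:
    # Stand-in for a real sensor driver (e.g., a temperature probe).
    return random.uniform(20.0, 90.0)


def send_to_cloud(payload: dict) -> None:
    # Stand-in for an uplink call (e.g., an MQTT publish or HTTPS POST).
    print("uplink:", payload)


def edge_loop(iterations: int = 120) -> None:
    window = []
    for _ in range(iterations):
        value = read_sensor()

        # Urgent events are handled immediately, without waiting on the cloud.
        if value > ALERT_THRESHOLD:
            send_to_cloud({"type": "alert", "value": value, "ts": time.time()})

        window.append(value)
        if len(window) == WINDOW_SIZE:
            # Send one small summary instead of WINDOW_SIZE raw samples,
            # which is where the bandwidth savings come from.
            send_to_cloud({
                "type": "summary",
                "mean": statistics.fmean(window),
                "max": max(window),
                "ts": time.time(),
            })
            window.clear()


if __name__ == "__main__":
    edge_loop()
```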


Local Storage and Computation:

Edge devices have their own computational resources and storage, allowing them to perform tasks such as data filtering, aggregation, analysis, and even machine learning inference locally. This enables faster response times and greater autonomy for edge devices.
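For example, a small model trained in the cloud can be pushed down to an edge device and evaluated there, so only the inference result ever leaves the device. The sketch below assumes a hypothetical logistic-regression model for predictive maintenance; the weights and feature values are made up purely for illustration.

```python
import math

# Illustrative weights for a tiny logistic-regression model that might have been
# trained in the cloud and deployed to the edge device (values are made up).
WEIGHTS = [0.8, -1.2, 0.4]
BIAS = -0.1


def predict_failure(features: list[float]) -> float:
    # Run inference locally: probability that the machine needs service.
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))


# Raw vibration, temperature, and current readings stay on the device;
# only the compact prediction would be reported upstream.
reading = [0.42, 0.91, 0.13]
probability = predict_failure(reading)
print(f"local inference result: {probability:.2f}")
```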


Edge Servers and Gateways:

In some cases, edge computing involves deploying intermediate edge servers or gateways between edge devices and the cloud. These edge servers can aggregate data from multiple devices, perform more complex processing, and act as a bridge between edge devices and the cloud.
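A minimal sketch of such a gateway is shown below: it buffers readings per device and forwards a compact aggregate upstream once a batch fills up. The EdgeGateway class, its batch size, and the print-based uplink are illustrative assumptions rather than any specific product's API.

```python
from collections import defaultdict
from statistics import fmean


class EdgeGateway:
    """Sketch of a gateway sitting between field devices and the cloud."""

    def __init__(self, batch_size: int = 100):
        self.batch_size = batch_size
        self.buffers = defaultdict(list)   # per-device buffers of raw readings

    def ingest(self, device_id: str, value: float) -> None:
        # Called whenever a downstream device reports a reading.
        buffer = self.buffers[device_id]
        buffer.append(value)
        if len(buffer) >= self.batch_size:
            self.forward(device_id, buffer)
            buffer.clear()

    def forward(self, device_id: str, readings: list) -> None:
        # In a real deployment this would publish to the cloud over MQTT/HTTPS;
        # here we just print the aggregated record the cloud would receive.
        summary = {
            "device": device_id,
            "count": len(readings),
            "mean": fmean(readings),
            "max": max(readings),
        }
        print("uplink:", summary)


gateway = EdgeGateway(batch_size=3)
for value in (21.0, 21.5, 22.1):
    gateway.ingest("sensor-7", value)
```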

Learn more AI terminology

Federated Learning

Deep learning

Prompt engineering

Generative AI

Generative Pre-trained Transformer (GPT)

Natural language processing (NLP)

Machine learning
