The rapid adoption of Internet of Things (IoT) technologies in smart agriculture has intensified the need
for efficient computing architectures capable of supporting real-time monitoring and decision-making.
This study presents a comparative analysis of edge computing and cloud computing architectures in
IoT-based smart agriculture systems, evaluated against five performance metrics: latency, bandwidth
utilization, energy consumption, system throughput, and reliability. A controlled experimental setup was
implemented using identical sensor configurations across both architectures to ensure a fair evaluation.
The results demonstrate that, relative to the cloud-centric architecture, edge computing markedly reduces
latency and bandwidth usage while improving energy efficiency and throughput, making it better suited to
time-sensitive agricultural applications. Cloud computing, however, remains effective for
centralized data storage and large-scale analytics. The findings highlight the trade-offs between the two
paradigms and emphasize the potential of hybrid edge–cloud architectures as a balanced solution for
scalable and responsive smart agriculture deployments.