A Survey on Fog Computing Intelligence used for IoT Environments

DOI : 10.17577/IJERTCONV5IS22034


Asha Rani M

Assistant Professor, Department of CSE, GSSSIETW, Mysuru

Dr. S. Meenakshi Sundaram

Professor & Head, Department of CSE, GSSSIETW, Mysuru

Abstract: Fog computing is the new buzzword in the computing world after cloud computing. This new computing paradigm can be seen as an extension of cloud computing. The main aim of fog computing is to reduce the burden on the cloud by moving workloads, services, applications, and large volumes of data toward the network edge. The Internet of Things (IoT) is driving a digital transformation in all aspects of our lives and businesses, and the growing number of connected devices is creating data at an exponential rate. As the Internet of Things evolves into the Internet of Everything and expands its reach into virtually every domain, high-speed data processing, analytics, and shorter response times are becoming more necessary than ever. Meeting these requirements is difficult with the current centralized, cloud-based model powering IoT systems, but it becomes possible with fog computing, a decentralized architectural pattern that brings computing resources and application services closer to the edge, where the data is being generated. Fog computing thereby reduces the amount of data that must be transferred to the cloud for processing and analysis.

Keywords: Cloud, Fog, IoT

I. INTRODUCTION

The Internet of Things (IoT) will be the Internet of the future, as we have already seen a huge increase in wearable technology, smart grids, smart homes/cities, and smart connected vehicles. International Data Corporation (IDC) predicted that in 2015 the IoT would continue to expand rapidly, growing the traditional IT industry by 14% over 2014 [1]. Since smart devices are usually limited in computation power, battery life, storage, and bandwidth, IoT applications and services are usually backed by powerful server ends, which are mostly deployed in the cloud, as cloud computing is considered a promising way to deliver services to end users and provide applications with elastic resources at low cost.

However, cloud computing cannot solve all problems, owing to its own drawbacks. Applications such as real-time gaming, augmented reality, and real-time streaming are too latency-sensitive to deploy in the cloud. Since cloud data centers are located near the core network, these applications and services suffer unacceptable round-trip latency when data are transmitted from end devices to the cloud data center and back through multiple gateways. Beyond latency, there are also unsolved problems in IoT applications that usually require mobility support, geo-distribution, and location-awareness. The latest trend in computing is to push elastic resources such as computation and storage to the edge of the network, which motivates the promising paradigm of fog computing, driven by the prevalence of ubiquitously connected smart devices that rely on cloud services. Fog computing keeps data and computation close to end users at the edge of the network, and thus provides a new breed of applications and services with low latency, high bandwidth, and location-awareness; it gets its name because fog is, analogously, a cloud close to the ground [2]. We call the facilities or infrastructures providing resources at the edge of the network fog nodes; besides resource-rich servers, fog nodes can also be less powerful edge devices such as routers, gateways, and switches.

Figure. Characteristics of the fog: more computation power, more data storage, more responsiveness, more mobility support, and less delay.

Fog computing has several unique properties that distinguish it from other existing computing architectures. The most important is its close distance to end users: keeping computing resources at the edge of the network is vital for supporting latency-sensitive applications and services. Another interesting property is location-awareness; a geo-distributed fog node is able to infer its own location and track end-user devices to support mobility. Finally, in the era of big data, fog computing can support edge analytics and stream mining, which process and reduce data volume at a very early stage, thus cutting delay and saving bandwidth.
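To make the edge-analytics idea concrete, the following is a minimal Python sketch of early-stage filtering and aggregation on a fog node. The function name, value range, and summary fields are illustrative assumptions, not part of any particular fog platform.

```python
# Hypothetical sketch of early-stage edge analytics on a fog node.
# Raw readings are filtered and reduced to a compact summary so that
# only a small fraction of the data ever travels to the cloud.
from statistics import mean

def summarize_window(readings, low=0.0, high=100.0):
    """Drop out-of-range samples and reduce one window to a single summary record."""
    valid = [r for r in readings if low <= r <= high]   # edge filtering
    if not valid:
        return None
    return {"count": len(valid), "min": min(valid), "max": max(valid), "avg": mean(valid)}

# Example: six raw samples collapse into one summary destined for the cloud.
window = [21.5, 22.0, 21.8, 999.0, 22.3, 21.9]           # 999.0 is a sensor glitch
print(summarize_window(window))
# {'count': 5, 'min': 21.5, 'max': 22.3, 'avg': 21.9}
```

In this sketch the fog node forwards one small summary per window instead of every raw sample, which is exactly the delay- and bandwidth-saving behaviour described above.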

II. LIMITATIONS AND CHALLENGES OF CLOUD COMPUTING

Security and Privacy: Data security is a major issue for personal data and for the confidential data of organizations. Users have to depend completely on the cloud service provider for data privacy and security.

Technical Issues: The requirement for high-speed Internet connectivity makes the system complex, and various technical issues arise under high load.

Data lock-in: The lack of standard APIs restricts the migration of applications and services between clouds. With the rise of the cloud, problems of data portability, migration, and vendor lock-in will only increase.

Data segregation: Data segregation problems arise mostly in the multi-tenant usage model, where different users' virtual machines are co-located on the same hard disks or the same server. The risk here is failing to properly separate storage or memory between different users.

Data location: The geographic location of the data is also very important for securing the client's data and information. Rules and regulations for certain types of data differ from country to country, so a customer could become involved in legal issues without even noticing.

Recovery and back-up: Data protection and recovery are important aspects of the cloud; in disaster situations the recovery process can be quite slow.

However, despite its power, the cloud model is not applicable to environments where operations are time-critical or Internet connectivity is poor. This is especially true in scenarios such as telemedicine and patient care, where milliseconds can have fatal consequences. The same can be said about vehicle-to-vehicle communications, where the prevention of collisions and accidents cannot afford the latency caused by the round trip to the cloud server.

Capitalizing on the IoT requires a new kind of infrastructure. Today's cloud models are not designed for the volume, variety, and velocity of data that the IoT generates. Billions of previously unconnected devices are generating more than two exabytes of data each day, and an estimated 50 billion things will be connected to the Internet by 2020. Moving all the data from these things to the cloud for analysis would require vast amounts of bandwidth. Moreover, having every device connected to the cloud and sending raw data over the Internet can have privacy, security, and legal implications. Handling the volume, variety, and velocity of IoT data therefore requires a new computing model. The main requirements are to:

Minimize latency: Milliseconds matter when you are trying to prevent manufacturing line shutdowns or restore electrical service. Analyzing data close to the device that collected it can make the difference between averting disaster and a cascading system failure.

Conserve network bandwidth: Offshore oil rigs generate 500 GB of data weekly. Commercial jets generate 10 TB for every 30 minutes of flight. It is not practical to transport such vast amounts of data from thousands or hundreds of thousands of edge devices to the cloud (a back-of-the-envelope calculation follows this list). Nor is it necessary, because many critical analyses do not require cloud-scale processing and storage.

Address security concerns: IoT data needs to be protected both in transit and at rest. This requires monitoring and automated response across the entire attack surface.

Operate reliably: IoT data is increasingly used for decisions affecting citizen safety and critical infrastructure. The integrity and availability of the infrastructure and data cannot be in question.

Collect and secure data across a wide geographic area with different environmental conditions: IoT devices can be distributed over hundreds or more square miles. Devices deployed in harsh environments such as roadways, railways, utility field substations, and vehicles might need to be ruggedized. That is not the case for devices in controlled, indoor environments.

Figure 1. Connecting More and Different Kinds of Things Directly to the Cloud Is Impractical

Move data to the best place for processing: Which place is best depends partly on how quickly a decision is needed. Extremely time-sensitive decisions should be made closer to the things producing and acting on the data. In contrast, big data analytics on historical data needs the computing and storage resources of the cloud.
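As referenced in the bandwidth requirement above, a back-of-the-envelope calculation using only the figures quoted there shows why hauling raw edge data to the cloud quickly becomes impractical. The short Python sketch below simply does the arithmetic:

```python
# Rough arithmetic behind the bandwidth claim, using the figures quoted above.
jet_bytes = 10 * 10**12                    # 10 TB generated per 30 minutes of flight
jet_seconds = 30 * 60
jet_rate_gbps = jet_bytes * 8 / jet_seconds / 1e9
print(f"Average rate per jet: {jet_rate_gbps:.1f} Gbit/s")    # ~44.4 Gbit/s

rig_bytes = 500 * 10**9                    # 500 GB per week from one offshore oil rig
week_seconds = 7 * 24 * 3600
rig_rate_mbps = rig_bytes * 8 / week_seconds / 1e6
print(f"Average rate per rig: {rig_rate_mbps:.1f} Mbit/s")    # ~6.6 Mbit/s, sustained
```

Sustaining tens of gigabits per second per aircraft, or several megabits per second per rig across a whole fleet, is exactly the kind of backhaul load that local fog processing is meant to avoid.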

Traditional cloud computing architectures do not meet all of these requirements. The prevailing approach of moving all data from the network edge to the data center for processing adds latency. Traffic from thousands of devices soon outstrips bandwidth capacity. Industry regulations and privacy concerns prohibit offsite storage of certain types of data. In addition, cloud servers communicate only over IP, not the countless other protocols used by IoT devices. The ideal place to analyze most IoT data is therefore near the devices that produce and act on that data. We call this fog computing.
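Purely as an illustration of the placement decision described above (the thresholds and field names are assumptions, not part of any standard), such a choice can be thought of as a simple routing rule that weighs how quickly an answer is needed against how much data must move:

```python
# Hypothetical routing rule for deciding where a piece of IoT work is processed.
# The deadline and size thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Task:
    deadline_ms: float      # how quickly a decision is needed
    payload_mb: float       # how much data must move to process it

def choose_tier(task: Task) -> str:
    if task.deadline_ms < 100:         # time-critical: act at or next to the device
        return "edge/fog node"
    if task.payload_mb > 1000:         # bulky but not urgent: summarize in the fog first
        return "fog node (summarize, then forward)"
    return "cloud"                     # historical / big-data analytics

print(choose_tier(Task(deadline_ms=20, payload_mb=0.5)))      # edge/fog node
print(choose_tier(Task(deadline_ms=5000, payload_mb=4000)))   # fog node (summarize, then forward)
print(choose_tier(Task(deadline_ms=60000, payload_mb=10)))    # cloud
```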

III. HOW FOG COMPUTING PUSHES IOT INTELLIGENCE TO THE EDGE

IoT nodes are closer to the action, but for the moment, they do not have the computing and storage resources to perform analytics and machine learning tasks. Cloud servers, on the other hand, have the horsepower, but are too far away to process data and respond in time. The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.

A study by IDC estimates that by 2020, 10 percent of the world's data will be produced by edge devices. This will further drive the need for more efficient fog computing solutions that provide low latency and holistic intelligence simultaneously.

Fog computing has its own supporting body, the OpenFog Consortium, founded in November 2015, whose mission is to drive industry and academic leadership in fog computing architecture. The consortium offers reference architectures, guides, samples and SDKs that help developers and IT teams understand the true value of fog computing.

Already, mainstream hardware manufacturers such as Cisco, Dell, and Intel are teaming up with IoT analytics and machine learning vendors to deliver IoT gateways and routers that can support fog computing. An example is Cisco's acquisition of the IoT analytics company ParStream and the IoT platform provider Jasper, which will enable the networking giant to embed better computing capabilities into its networking gear and grab a bigger share of the enterprise IoT market, where fog computing is most crucial. Analytics software companies are also scaling products and developing new tools for edge computing. Apache Spark is an example of a data processing framework, well integrated with the Hadoop ecosystem, that is suitable for near-real-time processing of edge-generated data.
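As one hedged illustration of the kind of stream processing mentioned above, the following PySpark Structured Streaming sketch turns raw sensor readings arriving over a local socket into one-minute per-sensor summaries. The host, port, and the "sensor_id,timestamp,reading" record layout are assumptions made for the example, not part of any fog standard.

```python
# Sketch: near-real-time aggregation of edge sensor data with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import avg, col, split, window

spark = SparkSession.builder.appName("FogEdgeAnalytics").getOrCreate()

# Each incoming line is assumed to look like "sensor_id,timestamp,reading".
lines = (spark.readStream
         .format("socket")
         .option("host", "localhost")
         .option("port", 9999)
         .load())

readings = lines.select(
    split(col("value"), ",").getItem(0).alias("sensor_id"),
    split(col("value"), ",").getItem(1).cast("timestamp").alias("ts"),
    split(col("value"), ",").getItem(2).cast("double").alias("reading"))

# One-minute averages per sensor: only these summaries would be forwarded upstream.
summaries = (readings
             .withWatermark("ts", "2 minutes")
             .groupBy(window(col("ts"), "1 minute"), col("sensor_id"))
             .agg(avg("reading").alias("avg_reading")))

query = (summaries.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```

In a fog deployment, the console sink would be replaced by whatever transport forwards the summaries to the cloud tier.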

The fog extends the cloud to be closer to the things that produce and act on IoT data (Figure 2). These devices, called fog nodes, can be deployed anywhere with a network connection: on a factory floor, on top of a power pole, alongside a railway track, in a vehicle, or on an oil rig. Any device with computing, storage, and network connectivity can be a fog node. Examples include industrial controllers, switches, routers, embedded servers, and video surveillance cameras.

IDC estimates that the amount of data analyzed on devices that are physically close to the Internet of Things is approaching 40 percent. There is good reason: analyzing IoT data close to where it is collected minimizes latency, offloads gigabytes of network traffic from the core network, and keeps sensitive data inside the network.

Figure 2. The Fog Extends the Cloud Closer to the Devices Producing Data.

IV. THE KEY ADVANTAGES OF FOG COMPUTING

Keeping the data close to the user: To eliminate delays in data transfer, fog computing keeps the data close to the user instead of storing it in distant data centers.

Dense geographical distribution: Fog computing creates an edge network that sits at various points to extend direct cloud services. This dense, geographically distributed infrastructure helps to handle and analyze big data faster, and traversal of the entire WAN can be limited because administrators are able to support location-based mobility demands.

Great support for mobility: There is a tremendous increase in the number of devices and in the amount of data they generate. A fog system helps handle this large volume of data and information and provides a better, faster way to access and analyze it.

Save storage space: Fog computing is a great option for preventing inappropriate or irrelevant information from travelling across the overall network; this saves storage space and reduces delay.

Support for IoT: Fog computing can be applied in many areas where the Internet of Things is present. It can play an important role in various IoT applications, such as smart traffic lights and vehicles, smart grids, smart cities, wireless sensor and actuator networks, and cyber-physical systems.

Extension of cloud and integration with other services: Fog computing should not be considered a replacement for the cloud; it is an extension that provides filtered and faster inputs to the cloud and to users.

What Happens in the Fog and the Cloud

Fog nodes:

  • Receive feeds from IoT devices using any protocol, in real time

  • Run IoT-enabled applications for real-time control and analytics, with millisecond response time

  • Provide transient storage, often 1-2 hours

  • Send periodic data summaries to the cloud

The cloud platform:

  • Receives and aggregates data summaries from many fog nodes

  • Performs analysis on the IoT data and data from other sources to gain business insight

  • Can send new application rules to the fog nodes based on these insights
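To tie the two lists above together, here is a hypothetical, self-contained Python sketch of a fog node that buffers recent readings only transiently and forwards periodic summaries to a cloud-side aggregator. All class names, the two-hour retention window, and the summary fields are illustrative assumptions.

```python
# Hypothetical fog-node loop: transient local buffering plus periodic summaries
# pushed to a cloud-side aggregator. Names, fields, and intervals are illustrative.
import time
from collections import deque
from statistics import mean

class FogNode:
    def __init__(self, retention_s=2 * 3600):
        self.retention_s = retention_s      # keep raw data only transiently (~2 hours here)
        self.buffer = deque()               # (timestamp, reading) pairs

    def ingest(self, reading, now=None):
        now = time.time() if now is None else now
        self.buffer.append((now, reading))
        # Drop raw samples older than the retention window.
        while self.buffer and now - self.buffer[0][0] > self.retention_s:
            self.buffer.popleft()

    def summary(self):
        values = [r for _, r in self.buffer]
        if not values:
            return None
        return {"count": len(values), "avg": mean(values), "max": max(values)}

class CloudPlatform:
    def __init__(self):
        self.summaries = []

    def receive(self, node_id, summary):
        self.summaries.append((node_id, summary))
        # A real platform would analyze these and push new rules back to the fog nodes.

# Usage: one node ingests a few readings, then ships a single summary upstream.
node, cloud = FogNode(), CloudPlatform()
for r in (20.1, 20.4, 21.0, 20.8):
    node.ingest(r)
cloud.receive("node-17", node.summary())
print(cloud.summaries)
```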

V. BENEFITS OF FOG COMPUTING

Extending the cloud closer to the things that generate and act on data benefits the business in the following ways:

Greater business agility: With the right tools, developers can quickly develop fog applications and deploy them where needed. Machine manufacturers can offer machines as a service (MaaS) to their customers, with fog applications programming each machine to operate in the way that customer needs.

Better security: Protect your fog nodes using the same policy, controls, and procedures you use in other parts of your IT environment. Use the same physical security and cybersecurity solutions.

Deeper insights, with privacy control: Analyze sensitive data locally instead of sending it to the cloud for analysis. Your IT team can monitor and control the devices that collect, analyze, and store data.

Lower operating expense: Conserve network bandwidth by processing selected data locally instead of sending it to the cloud for analysis.

VI. CONCLUSION

Fog computing gives the cloud a companion to handle the two exabytes of data generated daily by the Internet of Things. Processing data closer to where it is produced and needed solves the challenges of exploding data volume, variety, and velocity. Fog computing accelerates awareness of and response to events by eliminating a round trip to the cloud for analysis. It avoids the need for costly bandwidth additions by offloading gigabytes of network traffic from the core network. It also protects sensitive IoT data by analyzing it inside company walls. Ultimately, organizations that adopt fog computing gain deeper and faster insights, leading to increased business agility, higher service levels, and improved safety.

REFERENCES

  1. M. Kaur and M. Bharti, "Fog Computing Providing Data Security: A Review," International Journal of Advanced Research in Computer Science and Software Engineering, vol. 4, issue 6, June 2014.

  2. M. Firdhous, O. Ghazali, and S. Hassan, "Fog Computing: Will it be the Future of Cloud Computing?," in Proceedings of the Third International Conference on Informatics & Applications, Kuala Terengganu, Malaysia, 2014.

  3. I. Stojmenovic and S. Wen, "The Fog Computing Paradigm: Scenarios and Security Issues," in Proceedings of the 2014 Federated Conference on Computer Science and Information Systems, pp. 1-8.

  4. F. Bonomi, R. Milito, J. Zhu, and S. Addepalli, "Fog Computing and its Role in the Internet of Things," in ACM SIGCOMM Workshop on Mobile Cloud Computing, Helsinki, Finland, 2012, pp. 13-16.

  5. D. Kovachev, "Mobile Multimedia Services in the Cloud," Ph.D. dissertation, RWTH Aachen University, Aachen, Germany, 2014.

  6. G. Press, "IDC: Top 10 Technology Predictions for 2015," http://goo.gl/zFujnE, 2014 [Online; accessed 18-June-2015].

  7. F. Bonomi, R. Milito, J. Zhu, and S. Addepalli, "Fog Computing and its Role in the Internet of Things," in Workshop on Mobile Cloud Computing, ACM, 2012.

  8. S. Yi, Z. Qin, and Q. Li, "Security and Privacy Issues of Fog Computing: A Survey," in International Conference on Wireless Algorithms, Systems and Applications (WASA), 2015.

  9. M. Satyanarayanan, P. Bahl, R. Caceres, and N. Davies, "The Case for VM-based Cloudlets in Mobile Computing," IEEE Pervasive Computing, 2009.

  10. ETSI, "Mobile-Edge Computing," http://goo.gl/7NwTLE, 2014 [Online; accessed 18-June-2015].

  11. H. T. Dinh, C. Lee, D. Niyato, and P. Wang, "A Survey of Mobile Cloud Computing: Architecture, Applications, and Approaches," Wireless Communications and Mobile Computing (WCMC), 2013.

  12. L. M. Vaquero and L. Rodero-Merino, "Finding Your Way in the Fog: Towards a Comprehensive Definition of Fog Computing," ACM SIGCOMM Computer Communication Review, 2014.

  13. J. K. Zao, T. T. Gan, C. K. You, C. E. Chung, Y. T. Wang, S. J. R. Mendez, T. Mullen, C. Yu, C. Kothe, C. T. Hsiao, S. L. Chu, C. K. Shieh, and T. P. Jung, "Pervasive Brain Monitoring and Data Sharing Based on Multi-tier Distributed Computing and Linked Data Technology," Frontiers in Human Neuroscience, vol. 8, no. 370, pp. 1-16, 2014.

  14. J. Archer et al., "Top Threats to Cloud Computing v1.0," Cloud Security Alliance, 2010.

  15. F. Bonomi, "Connected Vehicles, the Internet of Things, and Fog Computing," in The Eighth ACM International Workshop on Vehicular Inter-Networking (VANET), Las Vegas, USA, 2011.

  16. Cisco, "Cisco Delivers Vision of Fog Computing to Accelerate Value from Billions of Connected Devices," Cisco, Tech. Rep., Jan. 2014.

  17. K. Hong, D. Lillethun, U. Ramachandran, B. Ottenwälder, and B. Koldehofe, "Opportunistic Spatio-temporal Event Processing for Mobile Situation Awareness," in Proceedings of the 7th ACM International Conference on Distributed Event-based Systems (DEBS '13), ACM, 2013, pp. 195-206.

  18. H. Madsen, G. Albeanu, B. Burtschy, and F. Popentiu-Vladicescu, "Reliability in the Utility Computing Era: Towards Reliable Fog Computing," in Proceedings of the 20th International Conference on Systems, Signals and Image Processing (IWSSIP), July 2013, pp. 43-46.
