As mentioned before, edge computing pushes applications, data, and computing power (services) away from centralized points to the logical extremes of a network. Edge computing replicates fragments of information across distributed networks of web servers, which may be vast and span many networks. As a topological paradigm, edge computing is also referred to as mesh computing, peer-to-peer computing, autonomic (self-healing) computing, grid computing, and other names implying decentralized availability with no single central node.
To ensure acceptable performance of widely dispersed distributed services, large organizations typically implement edge computing by deploying web server farms with clustering, as sketched below. Once available only to very large corporate and government organizations, the technology has become accessible to small and medium-sized businesses as large-scale implementations have advanced and their costs have fallen.
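As an illustration of the farm-and-clustering approach, the minimal sketch below round-robins incoming requests across a pool of edge web servers. The host names are hypothetical placeholders, and real clustering software would also handle health checks, session affinity, and failover, which this sketch omits.

```python
import itertools

# Hypothetical pool of edge web servers in a single farm.
EDGE_SERVERS = [
    "edge-01.example.net",
    "edge-02.example.net",
    "edge-03.example.net",
]

# Round-robin iterator over the pool; production clustering also tracks
# node health and capacity, which is left out here for brevity.
_pool = itertools.cycle(EDGE_SERVERS)

def pick_server() -> str:
    """Return the next edge server that should receive a request."""
    return next(_pool)

if __name__ == "__main__":
    for request_id in range(6):
        print(f"request {request_id} -> {pick_server()}")
```

The design point is simply that no single machine fronts all traffic: requests are spread across the farm, so capacity can be grown by adding servers rather than by scaling one central host.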
The target end-user is any Internet client making use of commercial Internet application services.
Edge computing imposes certain limitations on the choice of technology platforms, applications, or services, all of which must be specifically developed or configured for edge computing.
Edge computing has many advantages:
Edge application services significantly decrease the volume of data that must be moved, the consequent traffic, and the distance the data must travel, thereby reducing transmission costs, shrinking latency, and improving quality of service (QoS); a rough worked example of this effect appears after these advantages.
Edge computing eliminates, or at least de-emphasizes, the core computing environment, limiting or removing a major bottleneck and a potential point of failure.
Security is also improved as encrypted data moves further in, toward the network core. As it approaches the enterprise, the data is checked as it passes through protected firewalls and other security points, where viruses, compromised data, and active hackers can be caught early on.
Finally, the ability to "virtualize" (i.e., logically group CPU capabilities on an as-needed, real-time basis) extends scalability. The edge computing market is generally based on a "charge for network services" model, and it could be argued that typical customers for edge services are organizations desiring linear scale of business application performance to the growth of, e.g., a subscriber base.
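To make the traffic and latency advantage above concrete, here is a back-of-the-envelope sketch comparing a client served by a distant core data center with one served by a nearby edge node that forwards only a summary upstream. Every figure in it (distances, fiber propagation speed, payload sizes) is an illustrative assumption rather than a measurement.

```python
# Back-of-the-envelope comparison: distant core data center vs. nearby edge node.
# All figures are illustrative assumptions, not measurements.

FIBER_SPEED_KM_PER_S = 200_000   # roughly 2/3 of c, a common rule of thumb for fiber

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

core_distance_km = 3_000       # assumed distance to a central data center
edge_distance_km = 50          # assumed distance to a nearby edge node

raw_mb_per_request = 50.0      # assumed data produced per client interaction
summary_mb_per_request = 0.5   # assumed size after edge-side filtering/aggregation

print(f"propagation RTT to core: {round_trip_ms(core_distance_km):.1f} ms")
print(f"propagation RTT to edge: {round_trip_ms(edge_distance_km):.1f} ms")
print(f"long-haul traffic without edge: {raw_mb_per_request:.1f} MB per request")
print(f"long-haul traffic with edge:    {summary_mb_per_request:.1f} MB per request")
```

Even under these generous assumptions the pattern holds: propagation delay scales with distance, and only the much smaller summary crosses the long-haul link, which is typically where transmission costs accrue.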