The edge is being sold to enterprise customers from just about every part of the technology industry, and there's not always a bright dividing line between "public" options – edge computing sold as a service, with a vendor handling operational data directly – and "private" ones, where a company implements an edge architecture by itself.
There are advantages and challenges to either option, and the right edge-computing choice for any particular organization depends on its individual needs, budget and staffing, among other factors. Here are some considerations.
Challenges of in-house edge computing
The IT-centric approach to edge keeps ownership of edge devices in-house. It is likely to appeal to businesses that face strict legal requirements about where their data can be at any given time – a healthcare provider would be a good example – or that have little institutional comfort with putting that data in the hands of third parties, as is common among utility and manufacturing companies.
Handling things in-house can be challenging, however. For one thing, according to Christian Renaud, IoT practice director at 451 Research, many IT shops lack the requisite expertise to handle an edge deployment on their own.
"We run into a few use cases where the internal IT team can't handle the edge infrastructure, so handing it off to a vendor makes a lot of sense," he said. "The challenge is that, with production systems, that's a whole different ballgame [than IT], so there's a pretty strict set of requirements in terms of what the OT vendors will let run on other people's networks."
The lack of common standards in edge computing also limits customers' ability to build their edge infrastructure using multiple vendors.
An organization might not be able to use one vendor's sensors without also buying its edge compute modules or networking gear, since they're all part of the same offering.
Forrester vice president and principal analyst Brian Hopkins contrasts edge with cloud computing, where interoperability, open frameworks and containerization make these concerns all but irrelevant.
"[Many cloud frameworks] don't have to worry about platforms or standards or anything, but when you move to the edge, all that abstraction doesn't exist," he said. "So you have to worry about what server you're running, what communication protocols you're using … it's hugely complicated."
Still, single-vendor edge infrastructure can have feature advantages. For example, Cisco's edge intelligence orchestration software, which runs on its networking equipment and is managed remotely by Cisco, can send back only the data that Cisco needs to run the software, not the operational data itself, Renaud said. A customer could thus use the software to operate an automated factory without specific data about its machines ever leaving its own networks.
Edge services can lead to lock-in
Edge as a service is the option that many vendors want to provide, since there's more functionality to offer, and thus more that they can ask customers to pay for. Handing off an edge-compute deployment to a vendor has the advantage of predictable costs – the customer pays set installments for the service rather than budgeting for and implementing a complex new computing system whose ultimate expense could grow unpredictably.
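As a conceptual illustration of the data-separation approach Renaud describes for Cisco's software – management telemetry goes back to the vendor while operational data stays on the local network – here is a minimal sketch. The field names and gateway logic are assumptions for illustration, not Cisco's actual implementation:

```python
# Hypothetical edge-gateway sketch: split each machine reading so that only
# vendor-facing health telemetry leaves the plant, while operational data
# (what the machines are actually doing) never crosses the network boundary.
# All field names here are invented for illustration.

OPERATIONAL_FIELDS = {"spindle_speed", "batch_recipe", "part_count"}
TELEMETRY_FIELDS = {"device_id", "uptime_s", "cpu_pct", "link_status"}

def split_reading(reading: dict) -> tuple[dict, dict]:
    """Return (stays_local, sent_to_vendor) views of one reading."""
    local = {k: v for k, v in reading.items() if k in OPERATIONAL_FIELDS}
    upstream = {k: v for k, v in reading.items() if k in TELEMETRY_FIELDS}
    return local, upstream

reading = {
    "device_id": "press-07", "uptime_s": 86400, "cpu_pct": 12.5,
    "link_status": "up", "spindle_speed": 4200, "batch_recipe": "A-113",
}
local, upstream = split_reading(reading)
assert "batch_recipe" not in upstream  # operational data never leaves the plant
assert "device_id" in upstream         # the vendor still gets device health
```

The point of the sketch is only the boundary: the vendor's remote management plane sees device health, while process data stays behind the customer's firewall.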
Outsourcing can also simplify operational responsibility, since it's the vendor's job to keep things running.
Ultimately, that's the direction in which a lot of companies are likely to go, according to Accenture North America networking lead Peters Suh.
"Like many other technology-related decisions, cost of ownership, sufficient technical resources and competencies, and whether there is a strategic value of owning the network-edge stack will factor in this decision process," he said. "However, over the long haul, most enterprises will likely look for third-party support."
Of course, this also means that vendor lock-in is very much in play. Take, for example, a connected factory using a third-party service to instrument and orchestrate its machinery. The provider deploys its own sensors, networking equipment and edge boxes for local control and fast analysis, and feeds everything to a back end that the customer can view for deeper insights.
If the factory owner then wants to change even one piece of the puzzle – say, swapping in more efficient sensors with new capabilities – it could upset the entire ecosystem and necessitate either a wholesale switch to a new vendor or an awkward, complicated implementation process to guarantee compatibility between the new sensors and everything else.
According to Renaud, that's changing, at least to some extent. Until quite recently, operational-technology vendors would largely dictate the terms of their deployments: if customers wanted an edge deployment, they had to accept exactly what the vendor had to offer.
"So now it's more 50/50, where everyone sits down at the table to decide where the data's gonna go, what the security's gonna be like, to get the desired OT outcome," he said.
"The challenge right now is that so much of the orchestration is dictated by the workload and the vendor in production environments."
It's hard to know what edge means
Given the constellation of different products and services billed as edge computing – and a pervasive lack of agreement on a vendor-independent definition of the term – it can be a chore just to nail down whether a given solution is "edge" at all, much less "in-house" or "edge as a service."
One solution might use private 5G or LTE for the networking piece but keep data entirely on a customer's servers. Another might use carrier connectivity to move data from a data center to a private cloud or to a different provider's cloud. Still others outsource the entire operational tech stack to a vendor that provides the sensors, edge hardware, networking and compute, and offers customers a dashboard through which they can view all the information they need.
All of these involve very different technologies with a wide range of suitable use cases, yet all are sold as "edge computing." To carriers, the "edge" is the edge of the network. According to Hopkins, the carriers have for years made most of their revenue selling simple connectivity of one sort or another, and they view edge computing as a way to introduce over-the-top services – such as management for numerous kinds of edge infrastructure – as a value-add.
"So if you're an advertiser or marketer, they're saying, 'If you want to install applications for routing ads to local customers, our infrastructure is a place to do that,'" Hopkins said.
Similarly, content-delivery networks like Fastly and Akamai are looking at their numerous global points of presence, traditionally used to stage in-demand data, and seeing them as an opportunity to branch out.
Since one of edge computing's hallmarks is providing services with very low latency, and that low latency has been the CDNs' core selling point for years, the Akamais and Fastlys of the world are eager to sell themselves as POPs for the edge. If a customer can figure out the connectivity piece, data processing can be done nearby in one of those POPs as a service. For example, the advertiser in Hopkins' example could use a CDN as a clearinghouse for location-dependent ad serving.
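The ad-serving scenario can be sketched in miniature: a function running in a nearby POP picks an ad based on the region the CDN has already determined for the request, so the decision happens close to the user rather than at a distant origin. The header name, region codes and ad inventory below are all invented for illustration; real CDN edge platforms each have their own request APIs:

```python
# Hypothetical edge-function sketch for location-dependent ad serving at a
# CDN point of presence. The "x-pop-region" header and the ad inventory are
# assumptions, not any specific CDN's actual interface.

ADS_BY_REGION = {
    "us-east": "Ad: Boston coffee roasters",
    "eu-west": "Ad: Dublin bike share",
}
DEFAULT_AD = "Ad: global brand campaign"

def serve_ad(request_headers: dict) -> str:
    """Pick an ad using the region code the POP attaches to the request."""
    region = request_headers.get("x-pop-region", "")
    return ADS_BY_REGION.get(region, DEFAULT_AD)

assert serve_ad({"x-pop-region": "us-east"}) == "Ad: Boston coffee roasters"
assert serve_ad({"x-pop-region": "ap-south"}) == DEFAULT_AD  # unknown region
```

Because the lookup runs in the POP itself, the round trip to a central ad server disappears, which is exactly the latency advantage the CDNs are selling.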