
An Endpoint Analytics Processor for Providing Efficient Endpoint Analytics Services

Disclosure Number: IPCOM000238931D
Publication Date: 2014-Sep-25
Document File: 5 page(s) / 111K

Publishing Venue

The Prior Art Database


An Endpoint Analytics Processor is disclosed for providing efficient endpoint analytics services.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 30% of the total text.



Disclosed is an Endpoint Analytics Processor (EAP) for providing efficient endpoint analytics services.

To analyze the processing performed by the EAP, consider programs that act as transformation agents: they process data that is transferred from one endpoint to another, for example from a source to a sink. Various languages exist that allow common operators on streams to be described and compiled. Edge adapters are provided to convert data sources and sinks into the stream's formats, and data-processing primitives are aggregated efficiently into processing elements. While analytics for stream processing are well researched, more forms of such processing and analytics are increasingly being developed in the commodity form of mashups. Application Programming Interfaces (APIs) and open-source languages that support this processing and analytics are increasingly available, and numerous companies enable them to run in clouds that support such processing.
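The two building blocks named above, edge adapters and processing elements aggregated from stream primitives, can be sketched minimally in Python. All names here (edge_adapter, processing_element, the sample operators) are illustrative assumptions, not part of the disclosed system.

```python
# Minimal sketch: an edge adapter converts a raw data source into the
# stream's record format, and a processing element aggregates several
# stream-processing primitives into a single unit. Names are hypothetical.
from typing import Callable, Iterable, Iterator

def edge_adapter(raw_records: Iterable[str]) -> Iterator[dict]:
    """Convert raw source data (here, text lines) into stream records."""
    for line in raw_records:
        yield {"payload": line.strip()}

def processing_element(*operators: Callable) -> Callable:
    """Aggregate stream primitives into one processing element."""
    def run(stream: Iterator[dict]) -> Iterator[dict]:
        for op in operators:
            stream = op(stream)   # chain each primitive over the stream
        return stream
    return run

# Two toy primitives: drop empty records, then normalize the payload.
def non_empty(stream):
    return (r for r in stream if r["payload"])

def to_upper(stream):
    return ({"payload": r["payload"].upper()} for r in stream)

pe = processing_element(non_empty, to_upper)
sink = list(pe(edge_adapter(["alice ", "", "bob"])))
```

Chaining the primitives inside one processing element mirrors the text's point that primitives are grouped for efficiency rather than deployed individually.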

Languages and open-source support for creating data-centric software on the fly are increasingly available to end users. These incorporate concepts similar to stream-processing languages, but ones that operate over a taxonomy of APIs and endpoints remain limited. The trend of endpoint-to-endpoint programming with data-manipulation operators is expected to continue growing in importance. In addition, newer innovations include the ability to call analytics-computation APIs within these programs: not just basic data-manipulation operations, but complex analytics with endpoint streams and databases as sources and sinks. These calls may target compute APIs that run on a separate cloud, such as those of an analytics API provider. They may also take the form of VMs for rent, such as software as a service (SaaS), which may run not only on the cloud providing the SaaS but anywhere. The programming model should therefore enable calls to endpoint computation APIs, as well as a rent-an-analytics-VM, run-where-you-want model.
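The two dispatch styles described above, calling a hosted analytics API versus running rented analytics VMs wherever the user chooses, can be illustrated with a small routing function. The endpoint URL, VM names, and function shown are hypothetical placeholders, not part of any real service.

```python
# Illustrative sketch of the two dispatch models described in the text:
# "api"  -> send the job to a hosted analytics-computation API endpoint
# "vm"   -> dispatch rented analytics VMs that may run on any cloud
# The URL and VM identifiers below are made-up placeholders.
from typing import Callable

def dispatch_analytics(mode: str, db1: str, db2: str, pattern_fn: Callable) -> dict:
    """Route one analytics request to an API endpoint or rented VMs."""
    if mode == "api":
        return {"target": "https://analytics.example.com/compute",  # hypothetical
                "args": (db1, db2, pattern_fn.__name__)}
    if mode == "vm":
        # Rent-a-VM model: each VM holds one database and may run anywhere.
        return {"target": ["VM1", "VM2"],
                "args": (db1, db2, pattern_fn.__name__)}
    raise ValueError(f"unknown dispatch mode: {mode}")

api_job = dispatch_analytics("api", "DB1", "DB2", len)
vm_job = dispatch_analytics("vm", "DB1", "DB2", len)
```

The point of the sketch is that the same program text can target either back end; only the dispatch mode changes, matching the run-where-you-want requirement.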

An example of endpoint analytics code is given as:

In stream X = relevant social account endpoint;   /* streams of input are obtained from the social account endpoint */
In stream Y = relevant weather endpoint;
Var DB1 <- X;   /* the social account data is loaded into a database */
Var DB2 <- Y;
Out stream CommonPatterns = anAnalyticsAPI endpoint(DB1, DB2, patternfunction);
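One plausible rendering of the pseudocode above in Python follows. The endpoint readers, the in-memory "databases", the analytics call, and the sample pattern function are all hypothetical stand-ins chosen for illustration; the disclosure does not specify these implementations.

```python
# Hypothetical Python rendering of the endpoint analytics pseudocode.
# Endpoints are simulated with fixed records; the "analytics API" is a
# local function; pattern_function is a toy pattern finder.
from typing import Callable, Iterable, Iterator, List

def read_endpoint(records: Iterable[dict]) -> Iterator[dict]:
    """Stand-in for an 'In stream' read from a remote endpoint."""
    yield from records

def pattern_function(db1: List[dict], db2: List[dict]) -> List[str]:
    """Toy pattern: field names that occur in records of both databases."""
    keys1 = {k for record in db1 for k in record}
    keys2 = {k for record in db2 for k in record}
    return sorted(keys1 & keys2)

def analytics_api(db1: List[dict], db2: List[dict], fn: Callable) -> List[str]:
    """Stand-in for the 'anAnalyticsAPI endpoint(...)' call."""
    return fn(db1, db2)

# In stream X = relevant social account endpoint
X = read_endpoint([{"user": "alice", "post": "hello"}])
# In stream Y = relevant weather endpoint
Y = read_endpoint([{"city": "york", "user": "alice", "temp": 20}])

DB1 = list(X)   # Var DB1 <- X
DB2 = list(Y)   # Var DB2 <- Y

# Out stream CommonPatterns = anAnalyticsAPI endpoint(DB1, DB2, patternfunction)
CommonPatterns = analytics_api(DB1, DB2, pattern_function)
```

In a real deployment the analytics_api call would be dispatched remotely, as the next paragraph describes, rather than executed in-process.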

In this example, either the analytics API is called, or analytics VMs (VM1, VM2) are dispatched on the data (DB1, DB2) with patternfunction. The language can run on a cloud platform for stream processing, on a hardware platform suitably enhanced to support runtimes for the languages, or on a hybrid of these cloud platforms. Here, the concept is to hav...