
Method and system for intelligent function analysis and verification on CIM based devices

IP.com Disclosure Number: IPCOM000172990D
Original Publication Date: 2008-Jul-24
Included in the Prior Art Database: 2008-Jul-24

In the device management industry, the Common Information Model (CIM) has been accepted as an industry-standard model by the Distributed Management Task Force (DMTF). In a CIM implementation, the CIM Object Manager (CIMOM) manages objects through the CIM protocol between a CIM Agent and a CIM Client. In a typical CIM-based storage management software architecture, the CIM Agent is implemented in the lower-level hardware devices and the CIM Client is implemented in the higher-level management software. The CIM Agent therefore becomes the management interface to the devices, so testing devices through the CIM protocol is essential. However, verifying functions on CIM-based devices is very difficult. The main problems are:

1. When verifying a function on a device, the current criterion for deciding whether the function succeeded is only the return code (success or failure information) of the function execution. This is often misleading: the return code may indicate success while the execution details are wrong. It is also impossible for people to know the execution details on the device, e.g. which component instance data was generated or which attribute of a component was changed.

2. A bigger problem: for some long-running tasks, the function returns as soon as the job is submitted to the device, but whether the execution ultimately succeeds depends on the real execution result on the device.

3. People usually need to write test-case code manually to verify device functions, which requires specialized skills and takes a long time, so the cost is very high.
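The long-running-task problem (item 2 above) is usually addressed by polling the job's final state on the device rather than trusting the immediate return code. The sketch below illustrates that idea; all names here (`FakeDevice`, `submit_job`, `job_state`, the state strings) are hypothetical stand-ins, not part of any actual CIM API, and the states are only loosely modeled on CIM job operational states.

```python
import time

# Hypothetical job states (loosely modeled on CIM job operational states).
RUNNING, COMPLETED, FAILED = "Running", "Completed", "Failed"

class FakeDevice:
    """Stand-in for a device that accepts a job and finishes it later."""
    def __init__(self, final_state, ticks=3):
        self._final_state = final_state
        self._ticks = ticks  # number of polls before the job finishes

    def submit_job(self):
        # The call itself "succeeds" immediately, regardless of the
        # job's eventual outcome -- exactly the trap described above.
        return "job-1"

    def job_state(self, job_id):
        if self._ticks > 0:
            self._ticks -= 1
            return RUNNING
        return self._final_state

def verify_long_running(device, poll_interval=0.0, timeout_polls=100):
    """Return True only if the job's *final* state is Completed."""
    job_id = device.submit_job()  # the return code alone is not enough
    for _ in range(timeout_polls):
        state = device.job_state(job_id)
        if state == COMPLETED:
            return True
        if state == FAILED:
            return False
        time.sleep(poll_interval)
    return False  # timed out without reaching a terminal state
```

A job that reports success on submission but later fails on the device is caught only by this kind of final-state check.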
4. When executing a function on a device, it is impossible to obtain the execution time or performance data for each sub-step on that device. For example, CreateReplica() first creates a volume and then copies data from source to destination, but the user cannot know the performance of the volume-creation sub-step and the data-copy sub-step separately.

5. People can write very specific checking code to verify a specific type of device, but such code is tied to that one device type; when moved to another device type, the original code usually does not work. Writing that kind of checking code also requires very deep skill or knowledge of the device.

This paper introduces a verification process that works as follows:

1. Based on the Function Verification Plan, users run their test cases on the Function Test Engine. The test cases may involve any standard CIM requests and extrinsic methods of the target devices covered by the plan. While the test cases run, the actual implementation in the CIM Agent executes the corresponding management operations on the CIM device.

2. The Monitor Engine works on the other side of the CIM Agent. It plugs into the CIMOM server and starts its monitoring work automatically. Every operation that occurs while the CIM Agent's implementation executes is recorded by the Monitor Engine in a local repository; these records become the input to the Analysis Engine in the next step. Inside the Monitor Engine, the presented invention features three sub-parts (a Function Monitor, a Performance Monitor, and an Indication Registration), each of which generates data in its specific area.

3. The Analysis Engine collects the data generated by the Monitor Engine module.
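The recording step in the Monitor Engine can be sketched as a simple hook that appends one record per observed operation to a local repository. This is an illustrative sketch only; the class and field names below (`MonitorEngine`, `OperationRecord`, `on_operation`) are hypothetical and do not come from the disclosure or from any CIM library.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OperationRecord:
    operation: str   # e.g. "CreateInstance", "ModifyInstance"
    component: str   # the affected component instance
    detail: dict     # generated instance data, changed attributes, etc.

@dataclass
class MonitorEngine:
    """Records every operation observed at the CIMOM to a local repository."""
    repository: List[OperationRecord] = field(default_factory=list)

    def on_operation(self, operation, component, detail):
        # Called once for each operation the CIM Agent's implementation
        # performs; the accumulated records feed the Analysis Engine.
        self.repository.append(OperationRecord(operation, component, detail))

# Example: two operations observed during one test case.
monitor = MonitorEngine()
monitor.on_operation("CreateInstance", "StorageVolume", {"Size": "10GB"})
monitor.on_operation("ModifyInstance", "StorageVolume", {"Name": "vol1"})
```

In the same spirit, the Performance Monitor sub-part could timestamp each record so that sub-step durations (e.g. volume creation vs. data copy) become visible, addressing problem 4 above.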
Using the predefined standard rules in the invention's Verification Rule Base, together with customized rules corresponding to the test scenarios in the user-supplied test plan, the Analysis Engine gives a verdict on the test results. The details in the test results represent the verification of the components implemented by the CIM Agent to manage CIM devices. Based on the modules in the Monitor Engine and the Analysis Engine, the results include component instances, attributes of those instances, operations conducted on the components, and any events related to the components. Users do not have to perform manual verification, because the invention's rule-based verification mechanism has an intelligent engine that does it for them.
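The rule-based verdict can be sketched as applying a set of predicates over the monitor records and passing only if all of them hold. This is a minimal sketch under assumed data shapes; the rule and function names (`rule_instance_created`, `analyze`) are hypothetical illustrations of the mechanism, not the disclosure's actual rule format.

```python
def rule_instance_created(records, component):
    """Standard rule: the expected component instance must have been created."""
    return any(r["operation"] == "CreateInstance" and r["component"] == component
               for r in records)

def analyze(records, rules):
    """Apply every rule to the records; verdict passes only if all rules hold."""
    results = {name: rule(records) for name, rule in rules.items()}
    return all(results.values()), results

# Records as gathered by the Monitor Engine during one test case.
records = [
    {"operation": "CreateInstance", "component": "StorageVolume"},
    {"operation": "ModifyInstance", "component": "StorageVolume"},
]

# One standard rule plus room for customized, scenario-specific rules.
rules = {
    "volume_created": lambda rs: rule_instance_created(rs, "StorageVolume"),
}

verdict, detail = analyze(records, rules)
```

Because rules are data rather than device-specific test code, the same engine can verify a different device type by swapping the rule set, which is the portability benefit claimed over hand-written checking code (problem 5).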