The IntelliView Vision System (IVS), an IIoT analytic imaging and alerting platform, is the product of over 300,000 hours of rigorous, customer-driven field testing and ongoing refinement. The IVS combines patented and proprietary software, supervised machine learning, and deep neural networks with distributed processing and the latest sensors.
The convergence of these technologies has produced solutions that deliver enhanced detection accuracy and high analytic performance with low false-positive rates. Another key strength of the IVS is its flexibility in solving the unique, mission-critical problems of today’s industries, such as above-ground liquid leak detection, remote equipment inspection, and perimeter surveillance for Midstream Oil and Gas and Mining companies.
Analytic Software Capabilities:
- Live Bi-Spectral Sensor Input Analysis – The IntelliView analytic software is applied in real time to both the thermal and the full HD color video feeds of the DCAM™, doubling the system’s detection capability. The FLIR Long Wave IR sensor (uncooled microbolometer) tracks emitted energy without requiring lighting, allowing the analytics to perform well even in adverse weather.
- Dual-Sensor Detection Correlation – An event is analyzed independently by the DCAM’s thermal sensor and color sensor, and the results are correlated. This approach introduces an additional layer of event qualification at the network edge, which helps reduce false alerts.
- Multi-Data Processing – Various types of data (e.g. radiometric, environmental, user input, etc.) are analyzed and presented at the System Console as information that can be acted upon by the monitoring personnel and first responders.
- Multi-Region Detection – Independent analytic rules can be applied to different areas of a camera view to provide comprehensive coverage, especially for complex and unconventional sites.
- Analytic Rule Coupling – Two analytic conditions from either or both sensors of a DCAM can be merged to work in tandem, which improves object detection and adds a layer of validation.
- Analytic Control of Digital I/O Devices – Peripheral devices can be remotely activated and deactivated by analytic rules.
- Object Characteristics Specification – The analytics can detect objects of interest based on their unique properties (e.g. size, temperature, speed, color), disqualifying objects that fall outside these parameters.
- Object Validation by Classification – Deep learning artificial intelligence provides an extra layer of object qualification based on object type (e.g. person, car, animal).
- Low False Alert Levels – The combination of hardware and software technologies deployed in an IntelliView system enables the analytics to perform optimally and accurately in a wide range of conditions, including extreme temperatures (hot and cold climates) and indoor, outdoor, and hazardous environments.
- Environmental Filtering – The analytics mitigate the impacts of weather and ambient elements (e.g. glare, shadowing, heavy rain, snow, and fog), which are common causes of false positives.
- Image Stabilization (IntelliView proprietary software) – Image distortion from camera shake, typically caused by strong wind, is automatically corrected to help maintain optimal analytic function.
- Detection Sensitivity Control – Detection levels can be adjusted to suit a specific environment and help meet industry accepted false positive rates.
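As an illustration of how Object Characteristics Specification and Analytic Rule Coupling could work in principle, the following Python sketch filters detections by size and temperature and couples the two conditions so that both must hold before an alert is raised. All names, bounds, and thresholds here are illustrative assumptions, not IntelliView's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    width_m: float   # apparent width in metres
    height_m: float  # apparent height in metres
    temp_c: float    # radiometric surface temperature

def within(value, lo, hi):
    return lo <= value <= hi

def size_rule(obj):
    # Disqualify objects outside a person-sized envelope (illustrative bounds)
    return within(obj.width_m, 0.3, 1.2) and within(obj.height_m, 1.0, 2.2)

def temperature_rule(obj):
    # Thermal-sensor condition: surface temperature in a human-like band
    return within(obj.temp_c, 25.0, 40.0)

def coupled_rule(obj):
    # Rule coupling: both conditions must agree before an alert is raised
    return size_rule(obj) and temperature_rule(obj)

person = DetectedObject(width_m=0.5, height_m=1.8, temp_c=33.0)
vehicle = DetectedObject(width_m=1.9, height_m=1.5, temp_c=60.0)
print(coupled_rule(person))   # True: passes both size and temperature rules
print(coupled_rule(vehicle))  # False: fails size and temperature bounds
```

Coupling two independent conditions in this way is one reason a dual-sensor system can suppress alerts that either sensor alone would have raised.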
Implementation of IntelliView Analytic Software Technologies Within an Industrial IoT Architecture
1. Onsite (Network Edge):
IntelliView’s image processing technologies, built into the DCAM™, process thermal (LWIR) and color video feeds in real time. The system memorizes the background and adapts to background changes. When an event that meets user-specified conditions is detected, it is reported with image and video to the System Console, typically located in the customer’s control or monitoring center.
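A minimal sketch of the edge-side behaviour described above: the system "memorizes" the background, slowly absorbs gradual changes, and flags pixels that deviate sharply. This running-average model is a generic illustration under assumed parameters, not IntelliView's proprietary algorithm.

```python
ALPHA = 0.05        # background adaptation rate (slow drift is absorbed)
THRESHOLD = 20.0    # intensity delta that counts as foreground

def update_background(background, frame):
    """Blend the new frame into the memorized background model."""
    return [(1 - ALPHA) * b + ALPHA * f for b, f in zip(background, frame)]

def foreground_mask(background, frame):
    """Mark pixels that differ sharply from the memorized background."""
    return [abs(f - b) > THRESHOLD for b, f in zip(background, frame)]

background = [100.0, 100.0, 100.0, 100.0]   # toy 4-pixel scene
frame = [101.0, 99.0, 160.0, 100.0]         # third pixel: a new warm object
mask = foreground_mask(background, frame)
background = update_background(background, frame)
print(mask)  # [False, False, True, False]
```

In a deployed system the flagged region would then be tested against the user-specified analytic rules before any alert leaves the edge.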
2. At The Monitoring/Control Center:
The System Console (SC) can further evaluate an alarm event using supervised machine learning, employing deep neural networks akin to the systems used in self-driving cars. Incorporating adaptive artificial intelligence trained on pre-classified data adds a layer of improved recognition and processing of detected events to the analytics suite.
- Event Only Live Feed Streaming or Continuous Live Feed Streaming Option (multiple screens)
- Analytic Rules Management Tools (setup, scheduling, automation, coupling, etc.)
- Multiple Analytic Regions Per Camera Scene
- Event Detection Register with Photo/Video
- System Events/Status Register (reports on user actions, system errors, device conditions, etc.)
- Comprehensive and Customizable Alert Package
- Camera Health Check (indicates status, such as online, offline, disconnected, tampering, etc.)
- Event Notification Email Forwarding Option (with photo/video)
- PTZ Camera Settings and Controls
- Video Playback and Review Tools
- H.264 Digital Video Recording (local and remote storage)
- Photo and Video Exporting (MP4 or JPEG format)
- Secure Web User Interface (anytime, anywhere access via desktop computers, laptops and selected mobile devices)
- Advanced settings: analytic overlay, Modbus reporting, video summary, on-screen display (OSD), and more.
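The console-side validation step described above can be pictured with a short sketch: a detected event is escalated only when a classifier assigns it to a class of interest with sufficient confidence. The classifier here is a stub, and all class names and thresholds are hypothetical; in the real system this role is played by a trained deep neural network.

```python
CLASSES_OF_INTEREST = {"person", "vehicle"}  # illustrative alert classes
MIN_CONFIDENCE = 0.80                        # illustrative confidence floor

def classify(event):
    # Stand-in for a trained model; returns (label, confidence)
    return event["label"], event["confidence"]

def validate_event(event):
    """Escalate only events of an allowed class with enough confidence."""
    label, confidence = classify(event)
    return label in CLASSES_OF_INTEREST and confidence >= MIN_CONFIDENCE

events = [
    {"label": "person", "confidence": 0.93},
    {"label": "animal", "confidence": 0.97},   # not a class of interest
    {"label": "vehicle", "confidence": 0.55},  # below confidence floor
]
print([validate_event(e) for e in events])  # [True, False, False]
```

Filtering on class membership and confidence at the console complements the edge-side rules, so nuisance detections that survive the first stage can still be screened out before an operator is alerted.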