Making the human body a command center


ROCnROLL 

Subject: Proposal for the JOC & Remote Video Collaboration

Objective:

Advance the concept of the Joint Operations Center to include a Remote Operational Core Network with Real-time Optical Link (ROC’nROL) for access, viewing, and collaboration among key personnel and decision-makers. One solution is to review the functionality and feasibility of a mobile display device such as a visor. Visor (virtual head-mounted display) technology is becoming more capable and robust, and it offers a growing range of options by building on common computer-interface concepts. Any additional capabilities, desired or required, depend on the data types and the mobility trade-off, not necessarily on the limitations of the technology.

Concept of Operations:

Design a core network, based on flexible and interoperable technologies, that is ‘positionally’ independent of any specific physical location. Use current COTS technology, existing infrastructure, human-factors integration, and an open-systems/common architecture to develop a real-time link to the Situational Awareness Wall in the Joint Operations Center.

Implementation Plan:

COMTHIRDFLT’s J9 currently has several innovations and experiments that lend themselves ideally to integration with the ROC’nROL philosophy. These technologies are being installed aboard the Sea-Based Battle Lab in the very near future. A phased implementation would allow a logical, cooperative assessment of each technology while still providing a specific evaluation of the ROC’nROL concept.

Details:

The J9 approach is to tie the compatible technologies of current computer advances, visor (virtual head-mounted display) technology, wireless LAN, and the Blue Fiber Optic Network into one integrated package. The prototype version would initially utilize the ‘unclassified aspects’ of the JOC and commercial visualization technologies to provide a scalable solution in three phases, resulting in three types of end-user scenarios/capabilities. Current programs include:

· The DARPA-sponsored wireless LAN is due to be installed this month. The wireless-to-visor interface uses current technology and can be integrated immediately.

· The NAVAIR- and DARPA-sponsored ‘Blue Fiber’ backbone program is planned for installation in the next 2-3 months. This additional (initially unclassified) fiber-optic link will provide high-volume, high-speed throughput into the vital areas associated with the operational conduct of the Battle Watch.

· I3 [pronounced eye-cubed] is an IDIQ program under contract to develop human-factors integration of computer technology and command-center capabilities*. Its leading-edge concepts have provided key innovations that bring voice, visual, and non-conventional tactile feeds into the latest computer technologies. [*This technology is also highly targeted toward the handicapped community, where research into physical limitations can produce extraordinary advances in biometrics and robotics, as well as in visors and alternative input devices.]

Phase I. Stand-alone Hardware review and assessments

a- Wireless LAN: establish the stand-alone LAN and evaluate the area of coverage and issues regarding the network, computer capabilities, and throughput. [This phase demonstrates the WLAN as a shipboard option. A minimal throughput-check sketch follows this list.]

b- Visor technology and processing capabilities: procure at least 4 wearable processors/computers and evaluate their serviceability, wearability, and capabilities. Procure at least 8 visors or virtual head-mounted display devices for assessment. Determine the ‘best of breed’ based on the three end-user scenarios defined. [Find at least 2 video-display visors and 6 visors with processor-integrated display systems.]

c- Review software and software-development requirements and human-factors integration issues.
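The following is a minimal sketch, in Python, of how the throughput portion of item a- could be scripted: a sender pushes timed UDP bursts from a roaming node while a receiver at the fixed end tallies the achieved rate. The host address, port, payload size, and test duration are illustrative assumptions, not values specified by any of the programs above.

import socket
import sys
import time

HOST, PORT = "192.168.1.10", 5005    # assumed address of the fixed receiver
PAYLOAD = b"x" * 1400                # keep datagrams under a typical MTU
DURATION = 10                        # seconds of traffic per test point

def send():
    # Roaming-node side: blast datagrams for DURATION seconds, report the offered rate.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sent, start = 0, time.time()
    while time.time() - start < DURATION:
        sock.sendto(PAYLOAD, (HOST, PORT))
        sent += len(PAYLOAD)
    print("sent %.2f Mbit/s" % (sent * 8 / DURATION / 1e6))

def receive():
    # Fixed side: count bytes between the first and last datagram seen.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    sock.settimeout(5)               # give up 5 seconds after traffic stops
    got, first, last = 0, None, None
    try:
        while True:
            data, _ = sock.recvfrom(2048)
            now = time.time()
            first = first or now
            last = now
            got += len(data)
    except socket.timeout:
        pass
    if first and last and last > first:
        print("received %.2f Mbit/s" % (got * 8 / (last - first) / 1e6))

if __name__ == "__main__":
    receive() if "recv" in sys.argv else send()

Running the receiver at the operations-center end and the sender from different points around the coverage area would give a rough map of usable bandwidth per location.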

Phase II. Integrate Hardware/Software information into Human Factors

Define three end-user scenarios:

a. View and Discuss: video out to the roaming site, two-way voice back to the operations center (VoVxb). [A minimal sketch of this scenario follows this phase description.]

b. View, Discuss and Annotate: two-way video feed and collaborative response from the remote site.

c. Full Mobile Functions: full two-way connectivity, voice, and collaboration capabilities.

Demonstrate usability: stress-test the hardware/software integration and provide responsive, adaptive assessments of the equipment and features. Evaluate and assess the end-users’ responses to the technology, the capabilities, and the ‘wearability’ of the devices. Continue software development and modification to better integrate the technology with the human-machine interface.
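To make scenario a. concrete, the sketch below shows, in Python, a roaming-node loop that receives one-way video frames on one UDP socket while exchanging voice packets with the operations center on a second socket. Addresses, ports, and packet formats are assumptions, and show_frame, play_audio, and capture_audio are hypothetical stand-ins for the visor display and headset hardware.

import selectors
import socket

OPS_CENTER = ("10.0.0.1", 6001)      # assumed ops-center voice endpoint
VIDEO_PORT, VOICE_PORT = 6000, 6002  # illustrative local ports

def show_frame(frame):
    # Stand-in for the visor display driver.
    print("frame: %d bytes" % len(frame))

def play_audio(chunk):
    # Stand-in for the headset audio output.
    print("voice in: %d bytes" % len(chunk))

def capture_audio():
    # Stand-in for microphone capture; return None when there is nothing to send.
    return None

sel = selectors.DefaultSelector()

video_in = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
video_in.bind(("", VIDEO_PORT))      # ops center streams frames to this port
voice = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
voice.bind(("", VOICE_PORT))         # two-way voice rides on its own socket

sel.register(video_in, selectors.EVENT_READ, "video")
sel.register(voice, selectors.EVENT_READ, "voice")

while True:
    for key, _ in sel.select(timeout=0.02):
        data, _addr = key.fileobj.recvfrom(65535)
        if key.data == "video":
            show_frame(data)         # one-way video: display only
        else:
            play_audio(data)         # inbound half of the voice channel
    outgoing = capture_audio()
    if outgoing:
        voice.sendto(outgoing, OPS_CENTER)   # voice back to the Battle Watch

Scenarios b. and c. would extend the same loop with an outbound video/annotation channel rather than requiring a different architecture.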

Phase III. Additional Human Factor Assessments, Situation Options and Tool Package.

Continue with hardware assessment for additional flexibility and robustness in the operating environment. Continue with software development for a toolbox of collaborative elements to include:

· Multi-point voice communications over IP
· Multicast video display and collaboration over IP (a minimal multicast sketch follows this list)
· Hardening of visual devices: waterproof, shatterproof, etc.
· Integration with shared common environments: GPS, video cameras, future Bluetooth devices, etc.
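As an illustration of the multicast item above, the following Python sketch publishes a single feed to an IP multicast group so that any number of roaming visors on the LAN can receive it from one transmission. The group address, port, and TTL are assumptions chosen for a link-local demonstration, not values taken from the programs above.

import socket
import struct
import sys

GROUP, PORT = "239.1.1.1", 7000      # assumed administratively scoped multicast group

def send(payload=b"frame-or-annotation"):
    # Publisher: one datagram reaches every subscribed receiver.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # keep traffic on the local net
    sock.sendto(payload, (GROUP, PORT))

def receive():
    # Subscriber: join the group and print whatever arrives.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, addr = sock.recvfrom(65535)
        print("%s: %d bytes" % (addr[0], len(data)))

if __name__ == "__main__":
    receive() if "recv" in sys.argv else send()

Multi-point voice over IP could follow the same pattern on a second group address, with every participant both sending to and receiving from the group.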

Final Products:

· 4 Roaming Workstations with full bi-directional processor/computer capabilities
· 4 Roaming Headsets with viewing and collaborative voice capabilities
· A shared common operational view and work environment
