Virtual Environment for Spaceship & Multipurpose Use
Short description
A physical room serving as a multipurpose virtual test environment

Technical Details

Solution overview
- 1.1. The physical measurement environment is a cubic liquid tank enclosed by 5 faces (or 6 if capped).

- 1.2. All sides are perfectly reflective mirrors.

- 1.3. Each surface of the tank is fitted with a grid of light spots.

- 1.4. The distance between adjacent light spots is one unit of length.

- 1.5. The light spots are the reference points for the virtual space; based on them, the virtual model is reproduced and fitted onto the physical environment.

- 1.6. With 4-6 mirrors, the light points are reflected infinitely in all three dimensions of space, creating an unlimited grid of light points.

- 1.7. The reason for unlimited reference points: to create a rendering method, different from existing VR devices, based on *Real Perspective Distortion* (RPD):
- 1.7.1. The natural vision of the eye is based on the principle of perspective: the image on the retina is the result of a point-convergence projection.
- 1.7.2. A consequence of convergence projection is that the image of everything is distorted; for example, two lines that are parallel in reality appear in the eye's image as two lines converging at the horizon.
- 1.7.3. Propositional Problem (PP): knowing all dimensions of an object in space (DOS) and the distance and angle of the eye to the object (DAE), convergence projection lets us infer the distortion of the object's perspective image (DOI) that the eye will receive. Inverse Problem (IP): knowing DOI and DOS, we can infer DAE.
- 1.7.4. Existing VR devices apply the PP. The tracking basestation detects the DAE, the virtual DOS is known, and the computer generates the DOI without regard to the perspective of the actual scene in front of the eyes.
- 1.7.5. This submission uses the IP and PP at the same time: the DOI is known from the photo of the infinite light-point grid, captured in real time by a camera on the headset; the DOS of the light-point grid is constant (it depends on the spacing of the light spots and of the mirrors). Together with the real-time laser tracking system (item 2), the computer infers the DAE (the IP process). The virtual-reality environment is then generated from that DAE (the PP process).
- 1.7.6. In space-environment simulation tests, the viewing angle changes slowly (because of the space suit, item 4). When simulating other living situations (in a hotel, in a car, in public...), the viewing angle changes quickly. The ubiquity of the distortion reference points is the foundation for giving the VR device near-real-life vision.
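The IP/PP idea in 1.7.3-1.7.5 can be illustrated with a toy pinhole-projection model. This is a minimal sketch, not the submission's algorithm: the focal length, the function names, and the one-dimensional setup are all assumptions.

```python
# Minimal pinhole-camera sketch of the PP/IP processes (illustrative only).
# Assumptions: unit grid spacing (DOS) and a known focal length F.

F = 1000.0  # assumed focal length, in pixels


def project(point_xyz, f=F):
    """PP process: known 3D grid point plus eye pose -> distorted image point."""
    x, y, z = point_xyz
    return (f * x / z, f * y / z)


def infer_depth(image_x, real_x, f=F):
    """IP process (1D case): known real size (DOS) and its distorted image
    coordinate (DOI) -> distance from the eye (one component of DAE)."""
    return f * real_x / image_x


# A grid point one unit to the right of the axis, five units away:
u, v = project((1.0, 0.0, 5.0))   # PP: image coordinate of the point
z = infer_depth(u, 1.0)           # IP: recover the distance from the image
```

Inverting the projection in this way is only well-posed because the grid's dimensions are known in advance, which is exactly why the submission insists on a constant-spacing light-point grid.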

- 1.8. The difference between existing VR devices and this method: existing VR devices rely on the indirect result of tracking measurement, which may lag during fast motion, but they can be used in any room and commercialized. This method relies on the actual scene in real-time motion, which enhances realism and limits lag, but is harder to popularize.

- 2.1. The system of laser beams and sensors.

- 2.2. This system aims to accurately scan the volume model and movement of the body in real time.

- 2.3. It accurately reconstructs the volume and motion of the first responder in the virtual environment.

- 2.4. It enables accurate simulation of a human entity in a virtual environment. For example, can the first responder actually fit through a hole, or get stuck? Without an accurate volume and posture model of the first responder, such situations cannot be calculated.

- 2.5. Unlike 3D sensor scans of static objects, this system must capture everything happening to the 3D model in real time, as well as the first responder's skeletal movement (as Kinect does).

- 2.6. It can be arranged as a matrix of sensors or as a basestation sweeping 360°.

- 2.7. At least one pair of parallel matrix planes facing each other is needed, automatically scanning the first responder in real time.
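Item 2.6's sweeping-basestation option can be sketched as a 2-D triangulation toy: two stations at known positions each report a sweep angle, and intersecting the two rays recovers the tracked point. The station layout and function below are illustrative assumptions, not the submission's design.

```python
import math

BASELINE = 6.0  # assumed distance between the two stations, in grid units


def triangulate(angle_a, angle_b, baseline=BASELINE):
    """Station A sits at (0, 0), station B at (baseline, 0).
    Each sweep reports the angle from the baseline to the target;
    intersecting the two rays gives the target's (x, y)."""
    ta, tb = math.tan(angle_a), math.tan(angle_b)
    x = baseline * tb / (tb - ta)
    return (x, x * ta)


# Target actually at (2, 3); each station measures its sweep angle:
a = math.atan2(3.0, 2.0)
b = math.atan2(3.0, 2.0 - BASELINE)
pos = triangulate(a, b)
```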

- 3.1. The room will be filled with a transparent liquid.

- 3.2. The density of the liquid must be enough to lift the suited body so it floats in the room, like the pools used to prepare astronauts for the experience of microgravity.

- 3.3. It creates an environment for simulating impact forces (item 4.2).

- 3.4. When simulating situations on earth (hotel, house, public...), no liquid is needed.
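The density requirement in 3.2 reduces to a one-line neutral-buoyancy condition: the liquid's density must equal total mass over displaced volume. A sketch with illustrative numbers (the mass and volume below are assumptions, not measured values):

```python
# Back-of-envelope check for item 3.2: neutral buoyancy holds when the
# buoyant force equals the weight, i.e. rho_liquid = m_total / V_displaced.


def neutral_density(total_mass_kg, displaced_volume_m3):
    """Liquid density (kg/m^3) at which the suited body neither sinks
    nor rises."""
    return total_mass_kg / displaced_volume_m3


# Assumed 120 kg suited body displacing 0.12 cubic metres:
rho = neutral_density(120.0, 0.12)
```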

- 4.1. Likely the Advanced Crew Escape Suit.

- 4.2. The interactive force emulator: the system creates streams of suction and thrust to simulate impacts, including:
- 4.2.1. Holes on the surface of the suit, placed at various positions and each fitted with its own gyroscope sensor to detect its direction.
- 4.2.2. Hoses and tubes connecting the holes to the pumps, hidden under the fabric.

====== Due to the character limit, continued in the "DIY Recipe Kit" section ======
DIY Recipe Kit
===== continued from the previous section =====

- 4.2.3. The pumps create the impact on the first responder through hydraulic suction and thrust forces.
- 4.2.4. For example, when the first responder pushes an object in the virtual environment, this system pumps a thrust stream to simulate the inertial force acting on the first responder (it can even simulate the hardness of the material). When pulling an object, a suction stream is created to simulate the feeling of being pulled back.
- 4.2.5. When simulating situations on earth (hotel, house, public...), air pressure can be used for impact simulation.
- 4.2.6. This system also takes on the task of automatically returning the first responder to the middle of the room when he or she moves too far.
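The suction/thrust idea in 4.2 can be sketched as projecting the desired impact force onto each hole's outward direction (read from its gyroscope); the sign of the projection then selects thrust or suction. The hole layout and names below are assumptions:

```python
# Illustrative decomposition of a desired impact force onto the suit's holes.
# Positive command = thrust stream, negative command = suction stream.


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def hole_commands(force, hole_dirs):
    """Project the desired force vector onto each hole's outward unit
    direction; the sign of each result selects thrust (+) or suction (-)."""
    return [dot(force, d) for d in hole_dirs]


# Two opposed holes on the forearm; the simulation wants to push along +x:
cmds = hole_commands((2.0, 0.0, 0.0), [(1, 0, 0), (-1, 0, 0)])
```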

- 4.3. The vibration system:
- 4.3.1. It emulates the sensation of temperature. Hot or cold temperatures of varying degrees (those that harm a human) are signaled by different vibration modes.
- 4.3.2. Vibrators are placed at various positions over the suit, hidden under the fabric, in contact with the skin.
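The temperature-to-vibration mapping in 4.3.1 might look like the following sketch; the safe band and scaling constant are illustrative assumptions, not specified values:

```python
# Illustrative mapping from simulated temperature to warning vibration.

SAFE_LOW, SAFE_HIGH = 10.0, 40.0  # assumed harmless band, degrees C


def vibration_intensity(temp_c, max_intensity=1.0):
    """0 inside the safe band; grows with how far the simulated temperature
    strays outside it, capped at max_intensity."""
    if SAFE_LOW <= temp_c <= SAFE_HIGH:
        return 0.0
    excess = (SAFE_LOW - temp_c) if temp_c < SAFE_LOW else (temp_c - SAFE_HIGH)
    return min(max_intensity, excess / 50.0)  # assumed 50-degree ramp


hot = vibration_intensity(90.0)   # 50 degrees above the safe band
mild = vibration_intensity(20.0)  # comfortably inside the band
```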

- 4.4. Finger tracking: each finger of the gloves contains sensors that track its movement, as in existing VR devices. The control system is incorporated into the gloves.

- 4.5. Auxiliary systems:
- 4.5.1. The sound system: communication, warnings, simulation.
- 4.5.2. The hose that holds the spacesuit: power; data transfer; hoses for supplying and exhausting air (the air is not released directly into the liquid environment, because air bubbles would refract the light and make the DOI inaccurate). The hose is scanned by the laser tracking system and may be ignored in the virtual environment.

- 4.6. When simulating situations on earth (hotel, house, public...), the suit can be modified accordingly.

- 5.1. One of the problems with using RPD: the real distortion captured by the camera must be the same image the eyes see. Normally, the camera would have to be inside the eye.
- 5.1.1. To solve this problem, a prism can be used (like a submarine's periscope glass).
- 5.1.2. The prism is placed in front of the eye, with the camera perpendicular to the eye's axis; the distance from the camera to the prism equals the distance from the eye to the prism. The camera then receives the same image the eye would receive, but it cannot capture the image when the eyeballs roll.
- 5.1.3. The movement of the pupil when the eyes roll must therefore be detected.
- 5.1.4. The technology in an autorefractor can solve this problem. It also provides the ability to detect the eye's focusing, which brings the VR device's vision close to natural eyesight; the eye-tracking technology in FOVE's headset is a promising example.

- 5.2. When the eyes roll, the DOI cannot be captured; item 5.1.1 only captures the DOI when the pupil looks straight ahead. Item 5.1.4 can detect the pupil, so the computer calculates the position and angle of the eyes and executes the PP process.

- 5.3. Some other supporting technologies:
- 5.3.1. Interpolation camera.
- 5.3.2. Distance tracking for the camera, gyroscope, accelerometer, magnetometer...
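The pupil-tracking step in 5.1.3-5.2 ultimately yields a gaze angle for the PP process. A minimal geometric sketch (the eyeball rotation radius is an assumed constant, not a measured value):

```python
import math

EYE_RADIUS_MM = 12.0  # assumed effective rotation radius of the eyeball


def gaze_angle(pupil_offset_mm, radius=EYE_RADIUS_MM):
    """Tracked pupil displacement on the eye's surface -> rotation angle
    (radians), which the PP process uses to render for a rolled eye."""
    return math.asin(pupil_offset_mm / radius)


theta = gaze_angle(6.0)  # pupil shifted 6 mm sideways
```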

- 6.1. The Intermediate Objects (TIO): physical objects that the system can recognize and place into the virtual environment:
- 6.1.1. They are non-functional real models; when placed into the test room, they are recognized by the system and reconstructed in the virtual environment as functional objects.
- 6.1.2. For example, a cylindrical model can be simulated as a fire extinguisher in the virtual space and can extinguish a fire in this environment.
- 6.1.3. Functional or mechanical TIOs can be simulated as well, but ideally they should have only automatic or mechanical movement features. Other features (thermodynamic, chemical, biological, etc.) should be virtualized.

- 6.2. Automatic airbag protection system: the airbag inflates automatically when a problem occurs, pushing the first responder out of the water. When simulating situations on earth (hotel, house, public...), the system protects the first responder from falling or colliding with the wall.

=====End of the "Solution overview"=====

DIY Recipe Kit
7. The room
Mirror wall panels can be separated into many parts and transported to the installation site. Each wall-panel module consists of a mirror with light spots and laser detectors on its surface; the electronics behind the mirror must be enclosed in a waterproof case.
Each module can be fitted to a stainless-steel frame to form a wall. The walls are combined into a complete room, assembled in place or craned into the pool.

8. The spacesuit:
The space between the fabric layers allows the hoses to be placed inside the spacesuit. A scuba unit with pumps maintains the pressure for both suction and thrust: it includes a vacuum vessel that maintains negative pressure and a vessel that maintains positive pressure.

====== Due to the character limit, continued in the "Safety" section ======
===== continued from the previous section =====

The holes in the suit's fabric are made of stainless steel and have their own gyroscope sensors to detect direction. They are arranged at suitable positions on the surface of the suit, and one hole can receive both types of pressure. To control the suction or thrust stream, valves determine whether a hole opens to the vacuum vessel or the positive vessel, and how strong the force should be.
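The valve logic above can be sketched as follows; the command range, vessel names, and scaling are assumptions for illustration:

```python
# Illustrative per-hole valve selection: each hole draws from either the
# vacuum vessel (suction) or the positive vessel (thrust).


def valve_setting(force_command, max_force=10.0):
    """Force command (+ thrust, - suction) -> (vessel name, opening 0..1)."""
    vessel = "positive" if force_command >= 0 else "vacuum"
    opening = min(1.0, abs(force_command) / max_force)
    return vessel, opening


setting = valve_setting(-5.0)  # half-strength suction on this hole
```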

9. The headset:
The headset integrates both available and yet-to-be-developed technology.
9.1. Available technology in the headset:
- Glasses and image-rendering technology
- Gyroscope, accelerometer, magnetometer...
- Eye tracking (item 5.1.4, like FOVE's headset)
- Camera for RPD (item 1.7; a GoPro camera can be used)

10. The supporting system:
- Skeletal tracking (Kinect)
- Camera that detects the Intermediate Objects (TIO, item 6.1; Augmented Reality)

11. Process of installation and setup
11.1. The weight data (total weight, local weights) and the model of the first responder (without the spacesuit) are collected.
11.2. In the pool, the suited first responder is scanned by the real-time laser tracking (RLT) system to collect the 3D model; every detail is modeled. Depending on the situation in the virtual environment, either the weight of the first responder's body alone, or the weight of both body and suit, is used for the physical-impact calculation. If in the virtual situation the first responder is not wearing a suit, only the body weight is used.
11.3. The position of the first responder is also collected in real time by the RLT system.
11.4. The first responder moves his or her limbs to activate the skeletal tracking, whose data is likewise collected in real time.
11.5. The RPD camera is activated. Beforehand, the system prepares a virtual grid of light dots in the virtual environment with the same dimensions as the grid of dots in the mirrors. When the camera detects the RPD, the IP process runs, and the PP process displays the scene of virtual grid points. The headset then shows the real scene the first responder is looking at. The system switches between the real and virtual scenes randomly and continuously, so the first responder can compare the real and virtual grids of points. If there is no difference, the setup is done.
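The grid-comparison check at the end of 11.5 can be sketched as a reprojection-error test between where the real grid points are seen and where the virtual grid draws them; the tolerance value is an illustrative assumption:

```python
# Illustrative acceptance test for the 11.5 calibration step.


def mean_error(real_pts, virtual_pts):
    """Mean absolute offset (in pixels) between matched grid points."""
    return sum(abs(r[0] - v[0]) + abs(r[1] - v[1])
               for r, v in zip(real_pts, virtual_pts)) / len(real_pts)


def setup_done(real_pts, virtual_pts, tol=0.5):
    """True when the virtual grid overlays the real one within tol pixels."""
    return mean_error(real_pts, virtual_pts) < tol


ok = setup_done([(0, 0), (10, 0)], [(0.1, 0.0), (10.0, 0.1)])
```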
11.6. The installation of Intermediate Objects.
Some scenes require objects for physical interaction, such as car rescues or carrying patients. It is easy to display these entities in a virtual environment, but to get the right feeling during an action, real objects of the same size and weight are needed to provide the sensation; the remaining task is to virtualize the object's image. A typical example is cinematic effects, where green models or a green-suited actor with indicator symbols are converted into new characters.
This setup step places real models with the same characteristics as the virtual objects, and the system virtualizes them in real time.
=====end of "DIY Recipe Kit" section=====

12. There are two types of virtual environments that this solution supports.
12.1. Type 1: microgravity, water environments, flight... This type needs the physical room filled with water, because impacts are simulated hydraulically (item 4.2).
Examples: in a spaceship, in open space, under the sea, in flight (floating in the air like a helicopter, not inside a plane).
12.2. Type 2: life on land. This type needs no water in the physical room. Examples: in a house, hotel, public place, car...
Both types use the force/temperature simulating system (FTS, item 14) to simulate impact forces.
12.3. All these forces stay within allowable limits. Forces computed to exceed human endurance are capped at the allowable maximum, and the screen turns red as an alarm.
The pressure vessels are located far from the first responder; they may be hung overhead and connected to the first responder by a hose that automatically follows his or her movement.
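The capping rule in 12.3 can be sketched as a clamp with an alarm flag; the numeric limit below is an illustrative assumption, not a specified safety value:

```python
# Illustrative force limiter for item 12.3.

MAX_FORCE_N = 400.0  # assumed allowable human-endurance limit, in newtons


def limit_force(computed_n):
    """-> (applied force, alarm flag). Forces beyond the limit are capped
    and flagged so the headset screen can turn red."""
    if abs(computed_n) > MAX_FORCE_N:
        capped = MAX_FORCE_N if computed_n > 0 else -MAX_FORCE_N
        return capped, True
    return computed_n, False


applied, alarm = limit_force(950.0)  # a computed force far over the limit
```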

12.4. The safety of the first responder also depends on how emergency situations are handled. The automatic airbag undertakes this task.
In an underwater environment, in an emergency the airbag automatically inflates and pushes the first responder upward out of the water.
In a terrestrial environment, when there is a possibility of impact with the wall or floor, the airbags do the same.

13. The airbags also provide a basic ability for terrain simulation. When stepping on rugged terrain in the virtual environment, the airbags under the feet can be pumped according to the roughness of the ground.
Realistic Elements - Virtual
14. The two main components that create the realistic feeling in the virtual environment are the headset and the force/temperature simulating system (FTS).
14.1. The headset, with smooth motion and a high-definition, true-color display in any situation, is the basis for realism. It also reproduces other sight effects such as near/far focusing and rolling eyes, and some biological characteristics such as dizziness and illusion.
14.2. The force/temperature system (FTS) simulates most physical effects of collision and temperature, both underwater and in terrestrial environments.
14.3. The coordination of the two systems is what matters most for creating a real feeling.
When hitting a virtual wall, the thrust streams from the holes apply a force to the hand that feels like being stopped. The thrust streams must have the right intensity and timing to simulate the feeling of hitting the wall, as well as the hardness of the material.
Although in the real world the hand has crossed the position where the wall would be (if it were real), in the virtual world the hand stays on the surface of the wall.
When facing a virtual fire or touching a hot material, the temperature simulators are activated; each local vibration point has an intensity corresponding to the harmfulness of the temperature at that point.
When a heavy virtual object floats through space and hits the first responder, all the holes on the impacted side emit thrust streams to simulate the force.
When grabbing a handle on the spaceship's wall and pulling back to float forward, the first responder feels the forces caused by the thrust streams in his or her hands; together with the backward motion of the scene, they make the hands feel the force needed to pull the body's weight.
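The wall example in 14.3 corresponds to a standard penalty-based haptic model: the displayed hand is clamped to the wall surface while the opposing thrust grows with penetration depth. A minimal sketch (the wall position and stiffness constant are assumptions):

```python
# Illustrative penalty-based wall response for the FTS + headset pairing.

WALL_X = 1.0       # assumed wall plane at x = 1, in virtual units
STIFFNESS = 300.0  # assumed newtons per unit of penetration


def wall_response(hand_x):
    """-> (displayed hand x, opposing thrust force in newtons).
    The real hand may cross the plane; the virtual hand never does."""
    if hand_x <= WALL_X:
        return hand_x, 0.0            # not touching the wall
    depth = hand_x - WALL_X
    return WALL_X, STIFFNESS * depth  # hand clamped to the surface


shown_x, force = wall_response(1.2)   # real hand 0.2 units past the wall
```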
14.4. The core of this solution is the power of supercomputer processors and programs.

14.5. Sound effects: the system distinguishes sounds coming from different directions in space, helping the first responder recognize where a sound comes from. The speaker system is arranged spherically around the head.
Realistic Elements - Physical
15. If *the force/temperature system* is the way to transfer virtual impacts from the virtual environment to the real world, then *the Intermediate Object* is the way to transfer information from the real world to the virtual environment. For training and testing functions, the TIO is a simple way to interact with objects that have the same characteristics as in the real world but can be displayed as part of the virtual world.
- 15.1. TIO should be used only for the main task, or the main subject of the experiment, since the other components can already be modeled with the virtual model and simulated with the FTS.
- 15.2. For example, to simulate a patient, a green model with an indicator symbol can be used; the model has the same size and weight as a normal human.
- 15.3. When virtualized, the model is recreated with an appearance and a health status. Based on the indicator symbol, the system can simulate the patient doing things in place, but not moving (except for a robotic model).
- 15.4. A TIO can affect the virtual environment; for example, the extinguisher model can extinguish a virtual fire. But the virtual environment cannot impact the TIO in the real world.
16. The airbags also provide the basics of terrain simulation (item 13).
Technology Testing
17. This submission offers solutions and methods around the essential basic features, enabling the testing of multiple technologies and interfaces.

17.1. Almost all testing of technologies and interfaces that meet the requirements must ensure two-way interaction between the real world and the virtual world. The impact of the real world on the virtual world is already handled well by current technologies.

17.2. This submission complements this with a way for the virtual world to affect a real human.
For the purposes of training and testing, three of the five human senses are served by this solution.
It allows virtual experimental impacts to be transmitted onto the human in reality.

17.3. Except for tests that require taste, smell, or sensations inside the body, all tests based on impact/heat factors can use this design.

17.4. Physical equipment under test can be treated as a TIO and should have only mechanical features; its other features must be virtualized (item 6.1.3).

17.5. There are two cases of physical equipment testing: without a human, and in context with a human:
17.5.1. Without a human, the testing of the equipment becomes an experience that can be virtualized entirely. Once virtualized, the TIO can affect the virtual world or be affected by it. The result is computed and can be shown in the virtual world; the first responder can perceive it through the support systems above (image, sound, temperature, force...), while the physical TIO remains intact. This solution does not deform the TIO; if a real model of the object after the virtual impact is needed, a 3D printer can be used.
17.5.2. In context with a human: as mentioned in item 6.1.3, except for mechanical motion features, all other features must be virtualized. In particular, the impact felt by the human also has to be converted into the three senses supported above (the visual, auditory, and force/vibration simulators).
17.5.3. For example, to simulate dizziness, the headset displays a blurred view, the speakers play noise, and the force simulator creates pressing forces that make the first responder move with difficulty and fall easily. An itching feeling can use the vibration system, etc.

17.6. The testing interface can be simulated easily, but if more realistic feelings are needed (e.g., feeling a button or a control bar), a TIO with an indicator symbol can be used.
Purchase Order List
18. The headset:
The headset integrates both available and yet-to-be-developed technology.
- Glasses and image-rendering technology
- Gyroscope, accelerometer, magnetometer...
- Eye tracking (item 5.1.4, like FOVE's headset)
- Camera for RPD (item 1.7; a GoPro camera can be used)

19. The supporting system: placed at the 8 corners of the room
- Skeletal tracking (Kinect)
- Camera that detects the Intermediate Objects (item 6.1; Augmented Reality)
- Basestation for hand tracking (HTC Vive)
First Responder Scenes
20. Current technologies allow a wide range of scenes to be displayed, and each technology has its own solutions for displaying a scene well.
The important point is that, to match this wide range of scenes, the interaction also needs to be diversified according to the scene. The terrain simulation (airbags) and the three simulated senses are aimed at exactly that.

21. This submission contributes a method for rendering images more effectively, using RPD with unlimited reference points.
The difference is that this method is based on data from the actual situation of the eye and on an infinite grid of reference points with known dimensions.
It is not yet possible to evaluate how promising this method is, or whether a new rendering algorithm based on endless reference points can be built.
However, the method reflects the natural principle of how a scene appears in space.

The second supplement is the TIO.
Scenes can be complemented with real components that are customizable both to interact well and to be true to the display. For example, to depict the corner of a cliff, a TIO such as a high frame can be used and virtualized to create the context for the situation.
The wide range of scenes this solution accommodates: microgravity, underwater, ocean, terrestrial scenes, large scenes, close-ups.
22. There are two data/metrics collection processes: one serving science and one serving visuals.
22.1. Collection process for science: geometric and physical measurement.
- Geometric: laser scanning, skeletal tracking, basestations..., collecting both static and dynamic data.
+ Static data: the model of the first responder, the models of the TIOs and their indicator symbols...
+ Dynamic data: the positions and actions of the first responder's model, the state of the TIOs, the skeletal movement...
- Physical: the total and local mass of the body, measured by the usual methods (the mass of a TIO is assigned by the system in the virtual environment; the real weight of the TIO exists to give the first responder the right feeling).

22.2. Collection process for visual data: the position, angle, movement, focusing, and state of the eyes.
- Position, angle, and movement of the eyes: the infinite light-point grid is data grounded in fact, used together with other data for visual metrics. Uses the RPD camera, camera distance tracking, pupil tracking, gyroscope, accelerometer, magnetometer...
- Focusing of the eyes: pupil tracking (as in an autorefractor).
- State of the eyes: based on the simulation context and the impacts/temperatures in the virtual environment, the system simulates the biological state of the eye, such as dizziness or blurred vision...
Supporting Documents - Visual Aids
