Submission voting: voting is closed.
QEBELS Network.
Short description
QEBELS Network: Quantum Enabled Blockchain Encrypted Laser Satellite Network, inspired by nature.


Introduce yourself or your team
We are Alejandro Bollana & Charly Karamanian, winners of the HeroX NASA Space Poop Challenge.

Alejandro Bollana is a passionate and gifted industrial designer. He works for the Ministry of Modernization, Innovation and Technology of the Buenos Aires City Government, where he met Charly; the two have been open innovation challenge teammates ever since.

Charly Karamanian is a simple man with huge concerns; finding simple solutions to complex challenges is his drive in life. He is passionate about solving global grand challenges: energy, environment, food, shelter, space, water, disaster resilience, governance, health, learning, prosperity and security. He excels at leading and mentoring individuals and multidisciplinary teams. A creative thinker, serial entrepreneur and TEDx speaker, he developed "La Casa G", one of the most sustainable houses in Latin America and the first in Argentina to mine SolarCoin, and moved into it with his family. He believes a better world is possible and that everyone has the potential to change their own reality.
What makes you an ideal candidate for this Challenge?
Alejandro is an industrial designer with proven experience in bringing ideas and concepts to life. Charly has a broad base of general knowledge across many science and art fields, with a deep understanding of exponential technologies. He is one of the leading references on sustainable innovation in Argentina and an experienced open innovation challenge hacker.


Describe your solution.
The proposed solution was developed to provide maximum connectivity, security, a seamless user experience and support for USSOCOM missions. To meet these requirements, we conducted extensive research into breakthrough technologies that could contribute to the challenge and designed a 360° solution that goes well beyond a single CubeSat design. We found inspiration in nature (the hummingbird, eagle, swarm of bees, scorpion and dragonfly) and built the solution around the following technologies: Laser relay GEO satellite, Swarm of CubeSats, Blockchain Technology, LaserCom, Integrated Photonics, Water Propulsion, Drones, Compound Optics, Augmented Reality, Iris Scanner, Bone Conduction and Quantum Encryption.

Please see attached .pdf file for full description and images of the solution.
What is the size of your proposed solution?
The proposed solution uses 1.5-unit (1.5U) CubeSats. Each CubeSat is about 4 inches x 4 inches x 6.7 inches (10 centimeters x 10 centimeters x 17 centimeters) and weighs approximately 5.5 pounds (2.5 kilograms).
Does your solution help Special Operations Forces missions? How?
Secure communications & high data rate on the move (wearable).
Helps on missions: find, fix, track, target, engage, assess.
Digitally track friendly (blue) & enemy (red) people and locations.
Display 3D Battlefields, live video from drones, etc.
Where known, identify platform accommodation requirements for power.
Same as OCSD project.
Where known, identify platform accommodation requirements for thermal control.
Same as OCSD project.
Where known, identify platform accommodation requirements for data transfer rate.
Same as OCSD project.
Where known, identify platform accommodation requirements for data transfer volume (per orbit).
Same as OCSD project.
Where known, identify platform accommodation requirements for bus stability and attitude control.
Same as OCSD project.
Can you identify any additional platform accommodation requirements for your solution?
Can your concept be implemented with current state-of-the-art flight-qualified components, or will it require additional development? Please describe.

Existing technologies requiring only minor integration or adaptation:
+ Laser Relay GEO Satellite.
+ Blockchain Technology.
+ Lasercom.
+ Augmented reality.
+ Iris recognition.
+ Bone sound conduction.

Existing technologies under development (expected to be ready within the challenge's required 12-24 month schedule):
+ Water/Ice propulsion.
+ Integrated photonics - compound optics.

Technologies requiring additional development:
+ Quantum key encryption.
Intellectual Property: Do you acknowledge that this is only the Concept Phase of the competition, and all ideas are to remain the property and ownership of USSOCOM for future discretionary use, licensing, or inclusion in future challenges?

Comments (public)

  • Jonathan Gael Dec. 15, 2017, 8:47 a.m. PST
    If eliminating contention at the bottom of layer 2 would provide a benefit for the optical Phy, then I'm happy to share DQ switching with you for the project.
  • Charly Karamanian Dec. 13, 2017, 10:26 a.m. PST
    QEBELS Network technologies.
  • Charly Karamanian Dec. 13, 2017, 10:25 a.m. PST
    Rescue version of the helmet.
  • Charly Karamanian Dec. 13, 2017, 10:22 a.m. PST
    The QEBELS Network solution stands out in both civilian and military (humanitarian and rescue) missions. For example, a rescue team working in an area affected by an incident can receive all kinds of vital information in real time from the CubeSats or drones directly to their helmets, and can see through augmented reality (within their visual field) a three-dimensional map of the area, including the silhouettes of people trapped under the rubble (captured by an infrared camera on a drone), the location of their companions, and all kinds of climatic, environmental and mission-coordination information.
  • Hugo Shelley Nov. 19, 2017, 5:55 a.m. PST
    Congratulations on getting so many votes! I read through your solution yesterday to try to understand how it works, but got stuck on something fundamental - maybe you can help.

    Ambient infrared light in the sky will appear as noise in your signal. A telescope cuts out most of this noise by precisely tracking the satellite and collecting light from a very tiny patch of sky around it. On a clear day, the signal-to-noise ratio will be about 10:1 (data from the OCSD project).

    Unlike a telescope your helmet collects light from the entire sky, so there’s a lot more ambient noise. Even if we cut out the light from the ‘eyes’ that aren’t facing the satellite, the signal to noise ratio will be around 1:100,000 and the data will be lost in a sea of ambient infrared!

    An omnidirectional receiver like this would be a breakthrough for lasercomm communications in general. So perhaps I’ve missed something about the way the elements work, or the system as a whole. Would love to know more!
    • Hugo Shelley Nov. 19, 2017, 10:35 a.m. PST
      Thanks for the reply!

      So if I understand correctly, each hexagonal section of the helmet is composed of many tiny lenses and photocells. Each hexagon covers about 10 degrees of sky, and that’s divided up between the photocells, each covering an estimated 0.2 degrees of sky.

      So each photocell covers roughly the same amount of sky as the telescope does, which is good news for reducing noise. Let’s assume a perfect filtering algorithm that ignores the noise from all the photocells which aren’t pointed at the satellite (as they’re just picking up stray infrared) and listens only to the signal from the one that is.

      However, at any one time, only one photocell in your helmet will be pointed at the satellite. And unlike the telescope, that photocell is pretty small! If we assume each one is 1mm in diameter then the signal will be 100,000 times smaller than the signal from the telescope. So although you’ve reduced your noise level, you’ve also reduced your signal - and the ratio between the two remains the same.

      Having said this I agree with you that lasercom has huge potential. And there’s definitely compatibility between both our solutions - a hybrid laser/UHF beamsteering system would be an interesting way forward, especially if you’re looking to form radio links from LEO to a moving aircraft.

      The Celestron 8” looks like a great telescope. I’d love a Canon 400mm f/2.8 for astrophotography, but they’re insanely expensive. Apparently Univ. Toronto can afford 48 of them!

      (The lenses here are all pointed in the same direction of course, but it’s still a really interesting read, especially with your interest in compound optics!)
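The per-photocell figures quoted in this exchange can be sanity-checked with quick arithmetic. The sketch below only replays the numbers from the thread (10° hexagonal sections, 0.2° photocells, an 8-inch telescope aperture versus a 1 mm photocell); all values are illustrative, not measured:

```python
# Sanity-check the figures quoted in the thread (illustrative only).

# Each hexagonal helmet section covers ~10 degrees of sky and each photocell
# ~0.2 degrees, so a 2-D tiling needs roughly (10 / 0.2)^2 cells per section:
cells_per_section = (10 / 0.2) ** 2   # 2500 photocells per section

# Collected signal scales with aperture area, so the raw signal penalty of a
# 1 mm photocell versus an 8-inch (~203 mm) telescope is the area ratio:
area_ratio = (203 / 1) ** 2           # ~41,000x, same ballpark as the ~100,000 quoted

print(f"{cells_per_section:.0f} cells/section, signal penalty ~{area_ratio:.0f}x")
```

This is consistent with the core of the argument above: narrowing each cell's field of view cuts the background noise, but the tiny aperture cuts the signal by a comparable factor, so the receiver does not come out ahead.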
    • Charly Karamanian Nov. 19, 2017, 12:55 p.m. PST
      Hi again, maybe the render of the helmet is misleading. We drew a narrow laser beam to illustrate that we are using laser technology, but in real life a laser beam shot from a LEO satellite or even a drone would be several meters wide by the time it reaches ground level. Given the convex shape of the helmet, approx. 1/3 of its surface would be hit by the laser beam at all times. Of course, not all the photonic receivers would get the same signal strength: the ones at the center of the hit area receive a stronger, more perpendicular beam, and the further a receiver is from the center, the weaker the signal. Please note that there are lenses over each photonic integrated circuit, so they get a reading even if the laser does not hit them perpendicularly. The proper approach is not to filter out the weak readings but to use them to enhance noise filtering and to feed a tracking algorithm. It’s a similar concept to your 4-element antenna, or to MIMO technology in a Wi-Fi router. This lets us get a precise location of the CubeSat or drone in the sky, so the emitter gimbal can track it. Thanks for the recommended reading!
    • Charly Karamanian Nov. 19, 2017, 1:27 p.m. PST
      I meant at least 2/3 of the helmet would be hit by the laser.
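One way to read the tracking approach described above is as an intensity-weighted centroid over all photocell readings, combining many weak signals rather than discarding them (similar in spirit to multi-antenna combining). The sketch below is a hypothetical illustration of that idea; the cell coordinates and intensity values are invented for the example and are not part of the design:

```python
# Hypothetical sketch: estimate where the laser beam's center falls on the
# helmet from the intensities of all photocells, weak readings included.

def beam_centroid(cells):
    """cells: list of (x, y, intensity) photocell readings on the helmet.
    Returns the intensity-weighted center of the illuminated area,
    or None if nothing is illuminated."""
    total = sum(i for _, _, i in cells)
    if total == 0:
        return None  # beam not hitting the helmet at all
    cx = sum(x * i for x, _, i in cells) / total
    cy = sum(y * i for _, y, i in cells) / total
    return cx, cy

# Strongest reading at (1, 1), weaker readings around it; the estimated
# center lands near the strongest cell, shifted slightly by its neighbours.
readings = [(0, 0, 0.2), (1, 0, 0.5), (1, 1, 1.0), (2, 1, 0.3)]
cx, cy = beam_centroid(readings)
```

Tracking this centroid over successive readings would give the pointing direction needed to steer an emitter gimbal back toward the satellite or drone.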
    • Hugo Shelley Nov. 19, 2017, 3:08 p.m. PST
      Yes, I’m assuming that the beam will be at least a few km wide, for the sake of the satellite’s positioning system if nothing else!

      If you wish 2/3 of the photocells to capture light from the satellite (as opposed to just being hit by that light) then each photocell must have a very wide field of view. In this scenario you have high signal but also high noise. Alternatively each photocell can have a very narrow field of view, resulting in low signal and low noise.

      I’m imagining your solution as an array of tiny lenslets (such a cute name) with a photocell at the focus of each one. Similar to the OCSD ground station, but with multiple lenses rather than one big mirror.

      However, if you’re looking to adopt the SPIDER technology and use interferometry rather than simply counting photons, then it’s a different kind of challenge. At first glance I can see that SPIDER’s lenslets have a maximum 0.35 mrad (0.02 degree) field of view - my worry would be that modifying the optics to accept light from all angles would make it impossible to mathematically reconstruct the image from the interference pattern.

      But this is where I should leave the conversation, as wide-angle interferometry across a curved surface is beyond what I’m qualified to comment on ; ) Best of luck in the rest of the competition!
  • Catherine J Fox Nov. 16, 2017, 10:17 p.m. PST
    Geniuses ;)
  • Carlos Gonzalez Manco Nov. 16, 2017, 12:17 p.m. PST
    Good luck, the use case is original
  • Zelm Aerospace Nov. 16, 2017, 10:22 a.m. PST
    You didn't win the Space Poop Challenge.
    • Charly Karamanian Nov. 16, 2017, 10:51 a.m. PST
      Dear Dave, Space Poop Challenge had 24 winners. We are one of them. Only the first three got the money price. All our 24 submissions were reviewed by NASA and some of them will eventually be used to develop the new generation of space suits. Regards, Charly K.
    • Zelm Aerospace Nov. 16, 2017, 11:05 a.m. PST
      My Mistake, I misunderstood your statement. Good luck!
    • Charly Karamanian Nov. 19, 2017, 9:28 a.m. PST
      Sorry for the typo: money prize (not price) ;)
  • maria elena Nov. 15, 2017, 7:58 p.m. PST
    Good luck!!! I hope you win. Congratulations!!
  • Jeam Alfaro Nov. 15, 2017, 3:20 p.m. PST
    Good luck, friends