NASA Beams “Hello, World!” Video from Space via Laser

NASA successfully beamed a high-definition video 260 miles from the International Space Station to Earth Thursday using a new laser communications instrument.

Transmission of “Hello, World!” as a video message was the first 175-megabit communication for the Optical Payload for Lasercomm Science (OPALS), a technology demonstration that allows NASA to test methods for communication with future spacecraft using higher bandwidth than radio waves.

“The International Space Station is a test bed for a host of technologies that are helping us increase our knowledge of how we operate in space and enable us to explore even farther into the solar system,” said Sam Scimemi, International Space Station division director at NASA Headquarters in Washington. “Using the space station to investigate ways we can improve communication rates with spacecraft beyond low-Earth orbit is another example of how the orbital complex serves as a stepping stone to human deep space exploration.”

Optical communication tools like OPALS use focused laser energy to reach data rates between 10 and 1,000 times higher than current space communications, which rely on radio portions of the electromagnetic spectrum.

Because the space station orbits Earth at 17,500 mph, transmitting data from the space station to Earth requires extremely precise targeting. The process can be equated to a person aiming a laser pointer at the end of a human hair 30 feet away and keeping it there while walking.
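The hair-at-30-feet analogy can be put in numbers with a quick back-of-the-envelope sketch. The hair width of roughly 100 micrometers is an assumption for illustration, not a figure from the article:

```python
# Back-of-the-envelope pointing requirement for the analogy above.
# Assumed: a human hair is ~100 micrometers wide; 30 feet is ~9.14 m.
hair_width_m = 100e-6
distance_m = 30 * 0.3048

# Small-angle approximation: hair width / distance gives the angle.
angle_microrad = (hair_width_m / distance_m) * 1e6
print(f"required pointing accuracy: {angle_microrad:.1f} microradians")
```

That works out to roughly ten microradians of pointing accuracy, sustained while both endpoints move.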

To achieve this extreme precision during Thursday’s demonstration, OPALS locked onto a laser beacon emitted by the Optical Communications Telescope Laboratory ground station at the Table Mountain Observatory in Wrightwood, California, and began to modulate the beam from its 2.5-watt, 1,550-nanometer laser to transmit the video. The entire transmission lasted 148 seconds and reached a maximum data transmission rate of 50 megabits per second. It took OPALS 3.5 seconds to transmit each copy of the “Hello, World!” video message, which would have taken more than 10 minutes using traditional downlink methods.
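The figures quoted here are internally consistent, as a quick check shows: 3.5 seconds at the 50 Mbps peak rate works out to the 175 megabits per video copy mentioned earlier, and a transfer of more than 10 minutes implies a traditional downlink rate below roughly 0.3 Mbps:

```python
# Cross-check the article's numbers: 3.5 s per copy at a 50 Mbps peak rate,
# versus "more than 10 minutes" over a traditional downlink.
rate_mbps = 50
copy_seconds = 3.5
video_megabits = rate_mbps * copy_seconds              # size of one video copy
implied_traditional_mbps = video_megabits / (10 * 60)  # upper bound on old rate
print(video_megabits, round(implied_traditional_mbps, 3))
```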

“It’s incredible to see this magnificent beam of light arriving from our tiny payload on the space station,” said Matt Abrahamson, OPALS mission manager at NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California. “We look forward to experimenting with OPALS over the coming months in hopes that our findings will lead to optical communications capabilities for future deep space exploration missions.”

The OPALS Project Office is based at JPL, where the instrument was built. OPALS arrived at the space station April 20 aboard SpaceX’s Dragon cargo spacecraft and is slated to run for a prime mission of 90 days.

View the “Hello, World!” video transmission and animation of the transmission between OPALS and the ground station, at:

http://youtu.be/1efsA8PQmDA

For more information about OPALS, visit:

http://go.nasa.gov/10MMPDO

For more information about the International Space Station, visit:

http://www.nasa.gov/station

Source: NASA

Google tech to bring 3D mapping smarts to NASA’s space station robots

NASA and Google are working together to send new 3D technology aloft to map the International Space Station.

Google said Thursday that its Project Tango team is collaborating with scientists at NASA’s Ames Research Center to integrate the company’s new 3D technology into a robotic platform that will work inside the space station. The robotic platform is known as SPHERES, which stands for Synchronized Position Hold, Engage, Reorient, Experimental Satellites.

NASA astronaut Mike Fossum works with one of the smart Spheres aboard the International Space Station. The robotic orbs will get some 3D-sensing smarts from Google this summer. (Photo: NASA)

The technology is scheduled to launch to the orbiting station this summer, although Google said a specific date hasn’t been set.

“The Spheres program aims to develop zero-gravity autonomous platforms that could act as robotic assistants for astronauts or perform maintenance activities independently on station,” according to a Google+ post from the company’s ATAP (Advanced Technology and Projects) group. “The 3D-tracking and mapping capabilities of Project Tango would allow Spheres to reconstruct a 3D-map of the space station and, for the first time in history, enable autonomous navigation of a floating robotic platform 230 miles above the surface of the earth.”

Earlier this year, Google announced that its Project Tango group is working to build an Android phone with sensors and chips that enable it to map indoor spaces in 3D.

The project, which includes scientists from universities, research labs and commercial partners, is led by Google’s ATAP group.

“Mobile devices today assume the physical world ends at the boundaries of the screen,” said Johnny Lee, the Project Tango leader, in a YouTube video. “Our goal is to give mobile devices a human scale understanding of space and motion.”

Google’s 3D sensing smartphone, which is still in the prototype phase, has customized hardware and software, including a 4-megapixel camera, motion tracking sensors, computer vision processors and integrated depth sensing.

The sensors make more than a quarter of a million 3D measurements every second, fusing the information into a 3D map of the environment.
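Each of those depth measurements is folded into the map via a standard back-projection step. The sketch below uses the textbook pinhole camera model; the intrinsic parameters (fx, fy, cx, cy) are illustrative placeholders, not Project Tango's actual calibration:

```python
def depth_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project one depth pixel (u, v) into a 3D camera-frame point
    using the standard pinhole model. Intrinsics here are illustrative."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel near the image center at 2 m depth lands almost straight ahead
# of the camera on the optical axis.
pt = depth_to_point(u=320, v=240, depth_m=2.0,
                    fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```

Transforming each such point by the device's tracked pose, then accumulating points across frames, is what builds the 3D map of the environment.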

NASA began working with Google last summer to get Project Tango working on the space station.

The Intelligent Robotics Group at the Ames Research Center is looking to upgrade the smartphones used to power the three volleyball-sized, free-flying robots on the space station. Astronauts will exchange the current smartphones used in the Spheres with the Google prototypes.

Each robotic orb is self-contained, with power, propulsion, computing and navigation equipment, along with expansion ports for additional sensors and appendages, such as cameras and wireless power transfer systems, according to NASA.

“The Project Tango prototype incorporates a particularly important feature for the smart Spheres — a 3D sensor,” said Terry Fong, director of the Intelligent Robotics Group, in a statement. “This allows the satellites to do a better job of flying around on the space station and understanding where exactly they are.”

In February, Google and NASA scientists took the smartphone prototypes on a zero-gravity test flight. The engineers used the flight to calibrate the device’s motion-tracking and positioning code to function properly in space.

NASA scientists say they envision that 3D-enabled Spheres could be used to inspect the outside of the space station or the exterior of deep space vehicles.

While Google’s 3D technology is set to go to the space station this summer, a SpaceX resupply mission slated to launch this afternoon will carry legs for the humanoid robot already working on the orbiter.

SpaceX was set to launch its third resupply mission on Monday but the liftoff was postponed due to a leak in the Falcon 9 rocket that will carry the Dragon cargo spacecraft aloft.

Since the summer of 2013, Google and NASA have been working together to bring 3D mapping technology to the International Space Station.

Source: Network World

Laser Demonstration Reveals Bright Future for Space Communication


Dec. 23, 2013 — The completion of the 30-day Lunar Laser Communication Demonstration or LLCD mission has revealed that the possibility of expanding broadband capabilities in space using laser communications is as bright as expected.

Hosted aboard the Lunar Atmosphere and Dust Environment Explorer known as LADEE, for its ride to lunar orbit, the LLCD was designed to confirm laser communication capabilities from a distance of almost a quarter-of-a-million miles. In addition to demonstrating record-breaking data download and upload speeds to the moon at 622 megabits per second (Mbps) and 20 Mbps, respectively, LLCD also showed that it could operate as well as any NASA radio system. “Throughout our testing we did not see anything that would prevent the operational use of this technology in the immediate future,” said Don Cornwell, LLCD mission manager at NASA’s Goddard Space Flight Center in Greenbelt, Md.

For example, LLCD demonstrated error-free communications in broad daylight, including operating when the moon was within three degrees of the sun as seen from Earth. LLCD also demonstrated error-free communications when the moon was low on the horizon, less than four degrees as seen from the ground station, showing that wind and atmospheric turbulence did not significantly impact the system. LLCD was even able to communicate through thin clouds, an unexpected bonus.

Operationally, LLCD demonstrated the ability to download data from the LADEE spacecraft itself. “We were able to download LADEE’s entire stored science and spacecraft data [1 gigabyte] in less than five minutes, which was only limited to our 40 Mbps connection to that data within LADEE,” said Cornwell. Using LADEE’s onboard radio system would take several days to complete a download of the same stored data. Additionally, LLCD was able to prove the integrity of laser technology to send not only error-free data but also uncorrupted commands and telemetry or monitoring messages to and from the spacecraft over the laser link.
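The five-minute figure checks out against the 40 Mbps bottleneck:

```python
# 1 gigabyte over the 40 Mbps internal link inside LADEE.
payload_bits = 8e9            # 1 GB, using the decimal gigabyte
link_bits_per_s = 40e6
transfer_minutes = payload_bits / link_bits_per_s / 60
print(transfer_minutes)  # about 3.3 minutes, within "less than five minutes"
```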

LLCD also demonstrated the ability to “hand-off” the laser connection from one ground station to another, just as a cellphone does a hand-off from one cell tower to another. An additional achievement was the ability to operate LLCD without using LADEE’s radio at all. “We were able to program LADEE to awaken the LLCD space terminal and have it automatically point and communicate to the ground station at a specific time without radio commands. This demonstrates that this technology could serve as the primary communications system for future NASA missions,” said Cornwell.
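The hand-off decision can be caricatured as picking whichever ground station sees the spacecraft highest above a minimum elevation mask. The station names and the 20-degree threshold below are illustrative only, not the actual LLCD logic:

```python
def pick_ground_station(elevations_deg, min_elevation=20.0):
    """Choose the station seeing the spacecraft highest above a minimum
    elevation mask -- a simplified stand-in for a hand-off decision.
    Station names and the 20-degree mask are illustrative placeholders."""
    visible = {name: el for name, el in elevations_deg.items()
               if el >= min_elevation}
    return max(visible, key=visible.get) if visible else None

station = pick_ground_station(
    {"White Sands": 55.0, "Table Mountain": 12.0, "Tenerife": 35.0})
```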

The ability of LLCD to send and receive high definition video was proven with a message from NASA Administrator Charlie Bolden, completing the trip to the moon and back with only a few seconds of delay. “Administrator Bolden’s message demonstrates NASA’s support for advancing this technology for both space and Earth applications,” said Cornwell. “It also allowed the LLCD team to showcase the quality and fidelity of our HD video transmissions over our laser communication link to and from the moon.”

Cornwell acknowledged that the LLCD mission is another great example of NASA partnerships with outside organizations to advance unproven technologies. He credits the work of Don Boroson and his team at the Massachusetts Institute of Technology’s Lincoln Laboratory (MIT/LL) in Lexington, Mass., for developing and operating both the space and ground laser communications terminals for LLCD. “We could not have made such great strides without the work of our partners at MIT/LL,” Cornwell said. “Their years of work and knowledge produced a communications system that far exceeded our expectation.”

NASA’s follow-on mission for laser communications will be the Laser Communications Relay Demonstration (LCRD). Also managed at Goddard, LCRD will demonstrate continuous laser relay communication capabilities at over one billion bits per second between two Earth stations using a satellite in geosynchronous orbit. The system also will support communications with Earth-orbiting satellites. More importantly, LCRD will demonstrate this operational capability for as long as five years, thus building more confidence in the reliability of this laser technology.

“We are very encouraged by the results of LLCD,” said Badri Younes, NASA’s deputy associate administrator for Space Communications and Navigation (SCaN) in Washington, which sponsored the mission. “From where I sit, the future looks very bright for laser communications.”

So it appears NASA could be making the next paradigm shift in communications: the same technology that has vastly upgraded our broadband connections on Earth could be expanding communications possibilities for NASA in the not-too-distant future.

Source: Science Daily

NASA Laser Communication System Sets Record with Data Transmissions to and from Moon

NASA’s Lunar Laser Communication Demonstration (LLCD) has made history using a pulsed laser beam to transmit data over the 239,000 miles between the moon and Earth at a record-breaking download rate of 622 megabits per second (Mbps).

LLCD is NASA’s first system for two-way communication using a laser instead of radio waves. It also has demonstrated an error-free data upload rate of 20 Mbps transmitted from the primary ground station in New Mexico to the spacecraft currently orbiting the moon.

“LLCD is the first step on our roadmap toward building the next generation of space communication capability,” said Badri Younes, NASA’s deputy associate administrator for space communications and navigation (SCaN) in Washington. “We are encouraged by the results of the demonstration to this point, and we are confident we are on the right path to introduce this new capability into operational service soon.”

Since NASA first ventured into space, it has relied on radio frequency (RF) communication. However, RF is reaching its limit as demand for more data capacity continues to increase. The development and deployment of laser communications will enable NASA to extend communication capabilities such as increased image resolution and 3-D video transmission from deep space.

“The goal of LLCD is to validate and build confidence in this technology so that future missions will consider using it,” said Don Cornwell, LLCD manager at NASA’s Goddard Space Flight Center in Greenbelt, Md. “This unique ability developed by the Massachusetts Institute of Technology’s Lincoln Laboratory has incredible application possibilities.”

LLCD is a short-duration experiment and the precursor to NASA’s long-duration demonstration, the Laser Communications Relay Demonstration (LCRD). LCRD is a part of the agency’s Technology Demonstration Missions Program, which is working to develop crosscutting technology capable of operating in the rigors of space. It is scheduled to launch in 2017.

LLCD is hosted aboard NASA’s Lunar Atmosphere and Dust Environment Explorer (LADEE), launched in September from NASA’s Wallops Flight Facility on Wallops Island, Va. LADEE is a 100-day robotic mission operated by the agency’s Ames Research Center at Moffett Field, Calif. LADEE’s mission is to provide data that will help NASA determine whether dust caused the mysterious glow astronauts observed on the lunar horizon during several Apollo missions. It also will explore the moon’s atmosphere. Ames designed, developed, built, integrated and tested LADEE, and manages overall operations of the spacecraft. NASA’s Science Mission Directorate in Washington funds the LADEE mission.

The LLCD system, flight terminal and primary ground terminal at NASA’s White Sands Test Facility in Las Cruces, N.M., were developed by the Lincoln Laboratory at MIT. The Table Mountain Optical Communications Technology Laboratory operated by NASA’s Jet Propulsion Laboratory in Pasadena, Calif., is participating in the demonstration. A third ground station operated by the European Space Agency on Tenerife in the Canary Islands also will be participating in the demonstration.

For more click the source link below.

Source: NASA

The Smallest Astronomical Satellite Ever Built Launched Today

One of the BRITE nano-satellites, as it was being assembled in Toronto

At the Satish Dhawan Space Centre in Sriharikota, India this morning (Feb. 25), the smallest astronomical satellite ever built was launched into orbit aboard the Polar Satellite Launch Vehicle C20 rocket. In fact, it wasn’t just one satellite, but two – each of the twin BRIght Target Explorer (BRITE) spacecraft take the form of a cube that measures just 20 cm (7.8 inches) per side, and weighs in at under seven kilograms (15.4 lbs).

The BRITEs were designed at the Space Flight Laboratory (SFL) of the University of Toronto Institute for Aerospace Studies. One of the two nano-satellites launched today, known as UniBRITE, was assembled at SFL and funded by the University of Vienna. The other, called BRITE-Austria, was assembled in Austria and funded by that country’s Technical University of Graz – it is being promoted as “Austria’s First Satellite.”

The twin BRITE satellites aboard the PSLV-C20 rocket this Monday morning

Once in orbit, the satellites will work together to monitor changes in brightness of some of the largest, brightest stars in the sky. Their relatively small onboard telescopes limit their ability to monitor dimmer stars, or to take “pretty pictures.” Unlike ground-based telescopes that could be used to view those same stars, however, the satellites won’t be limited by scintillation – the visual distortion of celestial bodies, created by turbulence in the Earth’s atmosphere. They will also be able to image their target stars day and night, and won’t be thwarted by weather conditions such as cloud cover.

Because such large stars oscillate more slowly than smaller ones, the satellites won’t have to monitor them continuously. Instead, they can just check in on the stars at regular intervals, taking note of what changes in brightness have occurred. This, in turn, means that the satellites don’t need to be placed in one “right” orbit, where they can see their chosen star at all times. As a result, future BRITEs could hitch a ride into space on any available rocket and be placed into orbit wherever it was convenient – within reason.
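The check-in-at-intervals strategy is the sampling theorem at work: the cadence only needs to exceed twice the highest oscillation frequency of interest. A sketch with an assumed (not mission-specified) oscillation period:

```python
# Sampling-theorem view of interval monitoring: the gap between check-ins
# must stay under half the shortest oscillation period of interest.
oscillation_period_hours = 24.0        # assumed period of a slow pulsator
max_sample_gap_hours = oscillation_period_hours / 2
print(max_sample_gap_hours)  # the satellite may look away for up to 12 h
```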

Along with its telescope, each of today’s two satellites contains three computers: one for instrument processing, one for housekeeping (keeping the satellite running), and one for attitude control. Approximately six watts of electrical power is provided by onboard solar cells.

Plans call for two other pairs of BRITE satellites to join the pair launched today, forming a “constellation” of six nano-satellites that will work together. Two of the six will be Canadian, two Austrian and two Polish.

“Big bright stars lead short and violent lives and deaths (supernovas) and in the process seed the universe with heavy elements without which life on Earth would be impossible,” the University of Toronto stated in a press release. “To better understand these stars is to better understand how life arose on our planet.”

More information on the construction of the nano-satellites is available in the video below.

Source: Gizmag

Latest Earth Satellite to Launch from California Coast

A rocket carrying an Earth observation satellite is scheduled to blast off from the California coast on a mission to keep a continuous eye on the planet’s resources.

The countdown for the Atlas V launch begins Monday morning from the Vandenberg Air Force Base along California’s central coast. The Landsat satellite is the eighth of its kind to be launched since 1972 to track glaciers, forest fires, crop production and coastlines.

Unlike its predecessors, the latest carries more powerful sensors and can return more images.

For the past four decades, the polar-orbiting Landsat satellites have documented changes to Earth’s surface including the effects of deforestation and urban sprawl.

The $855 million mission is managed by NASA and the U.S. Geological Survey.

Source: AP

Android-Powered Nexus One & Nexus S to Command Small Scale Spacecraft

NASA’s Ames Research Center is working on a new project designed to lower the cost of launching and operating small satellites in low Earth orbit (LEO). The project will use the Android-powered Nexus One and Nexus S phones to command the spacecraft.

The project is known as PhoneSat, and it will launch two different satellites into LEO, each with its own goals.

First there is PhoneSat 1.0, based on the Nexus One. Its one and only primary goal is to stay alive: it is designed to test whether the smartphone can operate for a reasonable amount of time in space. The Nexus One will use its camera to take pictures and send them back to Earth, along with other general information about the spacecraft. An external radio beacon will indicate that the satellite itself is intact; if the beacon’s signal is received but no signal arrives from the Nexus, the problem lies with the phone rather than the spacecraft. There is also an external device that monitors the Nexus One and reboots it if the flow of data stops.
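The external reboot device described above is a classic watchdog pattern. A minimal sketch, with names and the timeout chosen for illustration rather than taken from the PhoneSat design:

```python
import time

def watchdog(last_heartbeat_s, timeout_s, now_s=None, reboot=None):
    """If no data has arrived for longer than `timeout_s`, trigger a reboot.
    A minimal sketch of the external monitor; names and the timeout are
    illustrative, not taken from the PhoneSat design."""
    now_s = time.monotonic() if now_s is None else now_s
    if now_s - last_heartbeat_s > timeout_s:
        if reboot is not None:
            reboot()
        return True   # a reboot was required
    return False

# Example: 45 s since the last data frame, with a 30 s timeout.
needs_reboot = watchdog(last_heartbeat_s=0.0, timeout_s=30.0, now_s=45.0)
```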

Google Nexus One

Google Nexus One

Second there is PhoneSat 2.0, which will be based on the Nexus S and will feature additional hardware over PhoneSat 1.0. It will have solar panels so it can operate for a longer period of time, and scientists will be able to send commands to it because it has a two-way radio. Finally, PhoneSat 2.0 will carry magnetorquer coils and reaction wheels, devices that allow the satellite to orient itself and maintain proper position using electricity from the solar panels.
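Orienting a satellite with reaction wheels typically comes down to a feedback law that commands torque against the pointing error. A toy proportional-derivative sketch; the gains are arbitrary placeholders, not PhoneSat values:

```python
def pointing_torque(angle_error_rad, rate_rad_s, kp=0.02, kd=0.1):
    """Toy proportional-derivative pointing law for a reaction wheel:
    command torque opposing the attitude error and its rate of change.
    Gains kp and kd are arbitrary placeholders, not PhoneSat values."""
    return -kp * angle_error_rad - kd * rate_rad_s

# A positive pointing error with no angular rate yields a restoring
# (negative) torque command that swings the satellite back on target.
torque = pointing_torque(angle_error_rad=0.1, rate_rad_s=0.0)
```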

Google Nexus S

Google Nexus S

PhoneSat is part of a larger NASA program, the Small Spacecraft Technology Program, which has a goal to leverage the incredible technological advances in consumer technology to create cheaper spacecraft.

According to Ames engineer Chris Boshuizen, “Your cellphone is really a $500 robot in your pocket that can’t get around. A lot of the real innovation now happens in entertainment and cellphone technology, and NASA should be going forward with their stuff.”

The hardware these devices contain makes it clear why they are well suited to this kind of project: they have GPS, cameras, a compass, a gyroscope, a microphone and so on. To save weight, the screens and cases will be removed, and the batteries will be replaced with something more powerful and better suited to the journey.

Another reason it makes sense to use Google’s Android OS is that it is open source and can be configured however NASA desires. NASA can modify the OS source code as it sees fit and then flash it to the satellites.

In 2010, a group of engineers put two Nexus One devices into high-altitude rockets to see if they could handle the extreme forces of launch. One of the Nexus One devices was destroyed when its parachute did not deploy, but the other landed in perfect working condition. Both devices recorded data during the entire ride.

Watch the YouTube video here.