Yes, the FCC might ban your operating system

prpl.works

Over the last few weeks a discussion has flourished over the FCC’s Notice of Proposed Rulemaking (NPRM) on modular transmitters and electronic labels for wireless devices. Some folks have felt that the phrasing has been too Chicken-Little-like and that the FCC’s proposal doesn’t affect the ability to install free, libre or open source operating systems. The FCC in fact says their proposal has no effect on open source operating systems or open source in general. The FCC is undoubtedly wrong.

I want to make something entirely clear: I believe the FCC has the best of intentions. I believe they want to protect the radio spectrum and implement the E-LABEL Act as required by Congress. I believe they want to protect innovation in the technology industry. I also believe that their proposal harms innovation, endangers the free, libre and open source community and is generally anti-user.


Harvard Researchers Build $10 Robot That Can Teach Kids to Code

image

Mike Rubenstein wants to put robots in the classroom.

Working with two other researchers at Harvard University, Rubenstein recently created what they call AERobot, a bot that can help teach programming and artificial intelligence to middle school kids and high schoolers. That may seem like a rather expensive luxury for most schools, but it’s not. It costs just $10.70. The hope is that it can help push more kids into STEM, studies involving science, technology, engineering, and math.

The tool is part of a widespread effort to teach programming and other computer skills to more children, at earlier stages. It’s called the code literacy movement, and it includes everything from new and simpler programming languages to children’s books that teach coding concepts.

Rubenstein’s project grew out of the 2014 AFRON Challenge, held back in January, which called for researchers to design low-cost robotic systems for education in the developing world. Part of Harvard’s Self-Organizing Systems Research Group, Rubenstein has long studied swarm robotics, which aims to create herds of tiny robots that can behave as a whole, and he ended up adapting one of his swarm systems in order to build AERobot. It’s a single machine—not a swarm bot—but it’s built from many of the same inexpensive materials.

Source: Wired

What does URL stand for?

image

According to Pew Research, 69% of internet users know what URL stands for. Do you know what it stands for?

Source: Pew Research

This World Map Shows Every Device Connected To The Internet

image

A striking map created by John Matherly at search engine Shodan shows significant disparities in internet access across the world.

The graphic maps every device that’s directly connected to the internet. We first noticed it when geopolitical expert Ian Bremmer tweeted it.

Some of the dark spots on the map could be attributed to low population density in those areas, but by looking at the map it’s clear that internet access isn’t equal across the world.

The different colors indicate the density of devices — blue indicates fewer devices and red indicates more devices at a given location.

As you can see from the map, the US and Europe have very high levels of internet connectivity, with the exception of the less-populated areas of the western US. Africa is mostly an internet blackout, and Asia has much less internet connectivity than Europe and the US despite having very dense population centers in some areas.

Matherly told Business Insider how he put the map together, a process he said was fairly simple (at least for a tech guy):

The way it was performed is fairly straightforward:

1. Use a stateless scanner to send a Ping request to every public IPv4 address

2. Keep track of which IPs responded with a Pong

3. Find out where the IP is physically located using a GeoIP library (i.e. translates from x.x.x.x -> latitude/longitude)

4. Draw the map

Steps 1-3 took about 5 hours and the final step took 12 hours. This is possible because nowadays we have the technology (stateless scanning) to very efficiently talk to millions of devices on the Internet at once.
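
To make those steps concrete, here is a minimal Python sketch of the same four-step pipeline. It is not Matherly’s code: it swaps his stateless scanner for a slow, serial ping loop, scans only a tiny example range, and assumes a local GeoLite2 City database file for the GeoIP lookup and Linux-style ping flags.

```python
# Toy version of the pipeline above: ping a range, keep the responders,
# geolocate them, and scatter-plot the results. A real Internet-wide scan
# uses a stateless scanner such as ZMap; this serial loop only illustrates
# the idea. The GeoLite2-City.mmdb path is an assumption.
import ipaddress
import subprocess

import geoip2.database   # pip install geoip2
import geoip2.errors
import matplotlib.pyplot as plt


def ping(ip: str) -> bool:
    """Steps 1-2: send one ICMP echo request and report whether the host answered."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0


def locate(reader: geoip2.database.Reader, ip: str):
    """Step 3: translate x.x.x.x -> (latitude, longitude) via a GeoIP database."""
    try:
        loc = reader.city(ip).location
    except geoip2.errors.AddressNotFoundError:
        return None
    if loc.latitude is None or loc.longitude is None:
        return None
    return loc.latitude, loc.longitude


if __name__ == "__main__":
    # A tiny documentation range stands in for "every public IPv4 address".
    responders = [str(ip) for ip in ipaddress.ip_network("198.51.100.0/28").hosts()
                  if ping(str(ip))]

    with geoip2.database.Reader("GeoLite2-City.mmdb") as reader:
        points = [p for ip in responders if (p := locate(reader, ip)) is not None]

    # Step 4: draw the "map" as a plain latitude/longitude scatter plot.
    if points:
        lats, lons = zip(*points)
        plt.scatter(lons, lats, s=2)
        plt.xlabel("longitude")
        plt.ylabel("latitude")
        plt.title("Hosts that answered a ping")
        plt.show()
```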

Source: Business Insider

Kris Medlen to have second opinion

American Health Imaging (Cumming, GA.)


Updated: March 12, 2014, 12:31 PM ET

ESPN.com news services

Atlanta Braves right-hander Kris Medlen underwent an MRI exam Monday that revealed an “injury to the ligament” in his pitching elbow, according to general manager Frank Wren.

Wren said Medlen will seek a second opinion. According to The Associated Press, he is expected to meet this week with Dr. James Andrews, who performed Tommy John surgery on the pitcher in 2010.

Kris Medlen (Brad Barr/USA TODAY Sports): An MRI on Kris Medlen’s elbow revealed an “injury to the ligament,” the Braves said. The team will seek a second opinion for its slated Opening Day starter.

“His MRI showed injury to the ligament, but we don’t yet know the extent,” Wren said in a statement. “A diagnosis would be premature at this point. He will undergo further tests, and we will seek a second opinion. An MRI can…


Herbal, Weight Loss Supplements, Energy Drink Associated With Liver Damage, Liver Failure

I shouldn’t drink as many different energy drinks as I do.

American Health Imaging (Cumming, GA.)

image

Severe liver damage, and even failure, has been associated with the consumption of weight loss supplements, an herbal supplement and an energy drink, according to four separate case reports presented at the American College of Gastroenterology’s 78th Annual Scientific Meeting in San Diego, CA. Use of herbal and dietary supplements is widespread for a variety of health problems. Because many patients do not disclose supplement use to their physicians, important drug side effects can be missed.

Case Report 1: SlimQuick™- Associated Hepatotoxicity Resulting in Fulminant Liver Failure

There have been many reports of toxicity associated with dietary supplement use over the years, some with severe and even fatal outcomes. Lead investigator Dina Halegoua-De Marzio, M.D., reported a rare case of fulminant liver failure associated with the ingestion of SlimQuick™, a weight loss supplement containing green tea extract.

A 52-year-old female patient was admitted to the emergency room after one…


PiP Uses Facial Recognition To Reunite Lost Pets With Their Owners

image

Having your dog or cat run away is pretty traumatic. And even if somebody finds your furry friend, they might not know where to find you. If your pet ends up in a shelter, chances are high that it will be euthanized, so Philip Rooyakkers, the CEO of PiP – The Pet Recognition Company, decided to see if he and his team could use facial recognition instead of tags to more easily report and find more lost pets.

PiP launched its Indiegogo campaign. The company is looking to raise $100,000 in the next month, the final funds necessary to bring the app to market.

I ran into Rooyakkers at the GROW conference in Vancouver last week, and he told me that the company’s technology, developed by the image-recognition expert Dr. Daesik Jang, is able to recognize 98 percent of dogs and cats. With the help of some extra metadata (breed, size, weight, gender, colors etc.), this means PiP can recognize virtually every lost pet.

Anybody can download the app to report found pets. Pet owners pay a subscription to PiP (the plan is to charge $1.49 per month, with 2 percent of all proceeds going to local pet rescue charities), and the moment their pet goes missing, PiP will alert local animal control and rescue agencies, veterinarians and social media outlets.

This “Amber Alert” for missing pets is at the core of what the service does. It will also scan social media for postings about found pets. “We will not only broadcast across all social media that the pet is missing, but everyone with the app (in that locale) will get a pop-up Amber Alert. We will contact the owner directly to listen, provide PiP’s immediate response, and offer support,” Rooyakkers said in a statement today.

Whenever a pet is found, PiP will use its facial recognition software to see if it can find a match in its system. To avoid false positives, Rooyakkers told me, somebody will always look at the metadata to ensure everything checks out.
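
The article doesn’t describe how PiP’s matching actually works internally, but a lookup like the one above is commonly built as a metadata filter followed by an embedding comparison, with the top candidates queued for a human reviewer. The sketch below is purely illustrative: the PetRecord structure, the candidate_matches helper and the cosine-similarity ranking are my assumptions, not PiP’s API.

```python
# Hypothetical sketch of a "found pet" lookup: narrow the registry by
# metadata first, then rank the survivors by face-embedding similarity
# and hand the best candidates to a human reviewer. None of these names
# or scores come from PiP; they are invented for illustration.
from dataclasses import dataclass

import numpy as np


@dataclass
class PetRecord:
    pet_id: str
    species: str            # "dog" or "cat"
    breed: str
    embedding: np.ndarray   # face embedding from some recognition model


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def candidate_matches(found_embedding: np.ndarray,
                      found_species: str,
                      registry: list[PetRecord],
                      top_k: int = 5) -> list[tuple[PetRecord, float]]:
    """Return the top-k metadata-compatible records, ranked by similarity.
    A person would still confirm the final match, as the article notes."""
    candidates = [r for r in registry if r.species == found_species]
    scored = [(r, cosine_similarity(found_embedding, r.embedding))
              for r in candidates]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]
```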

Obviously, there are a few other ways to identify lost pets, including ID tags and microchip implants. However, there are numerous standards for microchips, so not every shelter or clinic can scan every chip. Facial recognition would also allow anybody to scan dogs or cats right after finding them without the need for any special equipment, which should make reuniting them with their owners faster and easier.

For more information click the source link.

Source: TechCrunch

Oculus Rift + Microsoft Kinect = full-on Virtual Reality?

image

The ledge I’m standing on has a strange existential duality. In the physical realm, it’s a thin strip of red, millimeters above the floor of a pristine white booth in a basement in Shoreditch, London where the 3D tinkerers and technologists (of everything from 3D film to 3D printing) at Inition keep their toys. In the digital realm, which, thanks to the Oculus Rift wrapped around my head, my senses have decided is the more real, the ledge is the only thing between me and a 300-foot plunge.

The voice from the other realm telling me to reach forward with my arms belongs to Inition founder Andy Millns. He’s concerned I’m going to bang my head (or perhaps his Oculus Rift) against the booth wall. That’s easy for him to say. My arms are otherwise engaged in an inept flailing in a simultaneous attempt to not fall off (inside the game, a fail state) or over (inside the booth, an ultra-fail state).

This isn’t Gizmag’s first play with an Oculus Rift. Back in February, Jonathan looked at a pre-launch version. Today, two things are different. Firstly, Inition’s Rift is the finished article (the current developer model, at any rate), and secondly, much more significantly, Inition has wired its Rift up to a Kinect (via a computer running the company’s in-house VR vertigo simulator, that is). To get across that ledge I can’t just push up on a thumbstick, or a W key. I physically have to walk. Or jump, as a previous tester (or perhaps victim) apparently attempted, having abandoned reality outright.

This is proper virtual reality, in other words, albeit in a compact form. The demo begins in a room which, unlike the ledge, I am not free to navigate. I can turn my head, of course, to examine a virtual chandelier, or to look out of a virtual window. As I’d come to hope, latency was all but imperceptible. As I’m impelled across the room by an external force (i.e. someone at Inition operating a keyboard), I come to face a door. The room, it turns out, was at the top of a skyscraper, built very close to another skyscraper which is inevitably though somewhat inexplicably connected by said ledge.

Now I’m free to move, and though, deep down, I’m perfectly content to observe proceedings from the doorway, it seems rude not to try to cross. The Kinect, looking down at me from above, can see the bright red ledge and map my progress across it, making Inition’s demo simultaneously augmented and virtual reality. Somehow, I manage to get to the other side without falling, and ready myself for the return journey (all 5 feet of it). But by now the effort of not falling off or falling over is overwhelming, and with one self-righting misstep, I plunge from the ledge and come crashing down to Earth without a thump, there to admire the virtual grass.

It’s great fun, and if I had difficulty, it may have been down to my unwillingness to let go of reality. As I lowered the Rift over my eyes, my brain clung on to the visual memory of the red ledge, conscious that even the minuscule difference in height could cause me to trip. I became convinced, rightly or wrongly, that where the Rift was telling me the ledge was didn’t match its actual location. Practice doubtless helps, but a safe playing environment will be essential for people to immerse themselves fully.

Coincidentally, that’s precisely the intention of Julian Williams, CEO of Wizdish. As part of Inition’s current AR vs VR event, part of the Digital Shoreditch festival, Williams is showing off his invention, which, accompanied by another Kinect sensor and Oculus Rift, lets people navigate a VR space by donning special shoes and sliding their feet over the slippery dish. Spotting an opportunity for more inept flailing, I gave it a whirl.

image

This time a Kinect was trained on my ankles. When detecting a walking motion (or something like it), the demo moved me forward in the direction I was looking. The VR itself was rudimentary, but the point here is that the Wizdish does a good job of allowing users to walk about in a virtual space without the worry of bumping into things. The combination of shoes and Wizdish does take some getting used to, but even the few minutes I spent skidding about the thing were sufficient to tell that using it would soon become second nature. The challenge future games makers face is to get the Kinect to determine which way the gamer is facing.
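
Gizmag doesn’t publish any of the demo’s code, but the behaviour described above (detect a walking-in-place motion from the tracked ankles, then move the player along the direction they are looking) can be sketched roughly as follows. The thresholds, the speed and the assumption that the gaze yaw comes from the Rift’s own head tracking are all mine, not Inition’s or Wizdish’s.

```python
# Rough, invented sketch of walk-in-place locomotion: if the Kinect sees
# the ankles moving enough between frames, advance the player along the
# direction the headset is facing. All constants are assumptions.
import math

STEP_THRESHOLD = 0.04   # metres of ankle movement per frame that counts as a step
WALK_SPEED = 1.2        # metres per second while "walking"


def update_position(pos_x: float, pos_z: float,
                    ankle_delta: float,   # frame-to-frame ankle displacement (m)
                    head_yaw: float,      # radians, from the HMD's head tracking
                    dt: float) -> tuple[float, float]:
    """Advance the player while the ankle motion looks like walking in place."""
    if abs(ankle_delta) > STEP_THRESHOLD:
        pos_x += math.sin(head_yaw) * WALK_SPEED * dt
        pos_z += math.cos(head_yaw) * WALK_SPEED * dt
    return pos_x, pos_z
```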

In one final effort to completely freak me out, Millns introduced me to Mark Lewis of Animazoo, makers of the IGS Glove. It’s an electronic glove which can track the motion of hands and fingers using inertial gyros without need of a camera (or Kinect sensor for that matter). Lewis invited me to place my hand on the “chopping block” in front of me. “You’re not afraid of electric shocks are you?” Millns quipped. He’s such a kidder. Still, I couldn’t help but think “oh dear” as I pulled another Rift over my eyes. At least this time I’d get to sit down.

image

“Nice statue,” I said, pointing vaguely ahead of me, forgetting that so far as Millns and Lewis were concerned, I was pointing at Julian Williams and his Wizdish at the other side of the room. It was then that I caught a glimpse of my hand, or its digital proxy. “You’ll notice a few fingers are already missing,” said Lewis. Thank you, yes, I had noticed that. What I was only just beginning to notice was the bloodied guillotine just above me.

It would be an exaggeration to say that my rational mind (what there is of it) had to overpower my instincts in order to place my hand under the guillotine, but this demo certainly has the power to disconcert. It’s not so much the drop of the blade as the anticipation of it, though Lewis gently touching my wrist to coincide with the incision of the blade was certainly effective. I had been expecting to lose another finger or two. Instead my whole hand had gone.

If the Oculus Rift demos by Inition and friends tell us anything, it’s that though the device may be well suited to standard video games, it has much greater potential for immersion when combined with a dedicated, safe environment (as with the vertigo demo) or when complemented by other technology like Kinect, the Wizdish and IGS Glove. If there were shortcomings in any of the demos, the limiting factor seemed to be the Kinect, not the Rift. And the Kinect, we’re told, has been greatly improved for Xbox One. Whether it will allow accurate tracking of body motion is perhaps doubtful, but it’s precisely this that the Rift is crying out for. Otherwise, barring a resolution bump or two, the Oculus Rift itself isn’t far away from perfection.

Source: Gizmag

Benghazi Talking Points Timeline

Clearly there is not much more to assume besides a major cover-up in the Benghazi tragedy. Just look at the altered talking points in the PDF below; they show a stark contrast to the original assessment. Make sure to read the entire list through, see the changes that have been made, and demand answers! Who changed these talking points? Who was behind this blatant misinformation given to the American people? Please take the time to look through these released talking-point alterations and try convincing yourself this has nothing to do with the fact that it happened during a vital election, or that it contradicts the Obama Administration’s claims that the terrorists were supposedly on their heels, running scared.

PDF