Tag Archives: Adafruit

Street Fighter II in AR is literally played on the streets

Via CNET:

Last year, I played Super Mario in AR on a HoloLens. The maker of that game, Abhishek Singh, told me he would be focusing on mobile games and ARKit in the future. Now he has another AR gaming idea he’s brewed up: Street Fighter II in AR, using ARKit on an iPhone, so you can literally fight in the street. And it looks… awesome.

Apple’s selection of ARKit games is growing, but there’s no actual Street Fighter game among them, and not many arcade fighting games at all.

His YouTube video shows him playing the game in city streets, on tables and in parking lots. And yes, there’s even a car-demolishing bonus round. You can’t download it, and it isn’t an official app — but maybe it should be.

Read more!

via Adafruit

Gates of Light: Architectural solutions to light pollution

Via Earther:

The Afsluitdijk is a 20-mile dam that has been protecting the low-lying Netherlands from the force of the ocean for decades. According to Daan Roosegaarde, a Dutch artist who works in urban environments, it’s quite famous in the country because “basically, it protects us from drowning.”

Roosegaarde recently installed the “Gates of Light” on 60 massive floodgates split between both ends of the dike. Because of sea level rise, the structure is in need of renovation. Roosegaarde wanted to use this opportunity to draw attention to the importance of the dam—which many take for granted—as well as to another environmental issue: light pollution.

According to the World Atlas of Artificial Night Sky Brightness, more than 80 percent of the planet’s land area, and 99 percent of the populations of the United States and Europe, look up to skies so polluted with light that the Milky Way is virtually invisible. Amsterdam’s skies are some of the most light-polluted in the world.

See more at Earther and on YouTube

via Adafruit

Graphene Hair Dye Could Be Coming!

New graphene hair dye promises perfect hair

Via fastcodesign:

A research team at Northwestern University has discovered a way to use sheets of graphene to dye hair. Unlike current chemical hair coloring products, the scientists report in the journal Chem that their new dye is nontoxic, antibacterial, antistatic, and you can apply it yourself with a spray. It looks like we can add “the holy grail of hair color” to graphene’s seemingly endless list of applications.

This is how it works: The user applies the graphene dye with a spray, then brushes the hair and dries it. The graphene forms a gentle film around each and every hair strand. Like in a sci-fi movie, your hair will change color before your very eyes as the sheets of graphene attach themselves to your mane. And since the research team says their method doesn’t require toxic solvents, molecular ingredients, or extreme heat, you don’t have to worry about damaging your hair, your skin, or yourself. The color lasts for at least 30 washes, as you’d expect from any conventional chemical-based dye, and when the graphene material eventually washes away, it leaves your hair in the same state it was in before you applied it.

Your graphene-enhanced superhero hair will also have some other superpowers. First, it’s antistatic, so you can say goodbye to flyaway hairs. Second, it’s antibacterial, so your hair will stay cleaner longer. Third, it has thermal regulation capabilities: in theory, your graphene-enhanced hair will be able to regulate the heat on your head better than your regular hair.

The fourth power is quite intriguing. The Northwestern team mentions that your graphene-treated hair will be able to interface with electronic components, since the coating can carry an electrical current. I can’t imagine the potential applications for this one, beyond adding LED beads that could display different colors depending on the thermal conditions of your scalp (purple for anger, for instance, or green for happiness).

See more!

via Adafruit

New Orleans Ends Its Palantir Predictive Policing Program

Via The Verge:

Two weeks ago, The Verge reported the existence of a six-year predictive policing collaboration between the New Orleans Police Department and Palantir Technologies, a data mining giant co-founded by Peter Thiel. The nature of the partnership, which used Palantir’s network-analysis software to identify potential aggressors and victims of violence, was unknown to the public and key members of the city council prior to publication of The Verge’s findings.

Yesterday, outgoing New Orleans Mayor Mitch Landrieu’s press office told the Times-Picayune that his office would not renew its pro bono contract with Palantir, which has been extended three times since 2012. The remarks were the first from Landrieu’s office concerning Palantir’s work with the NOPD. The mayor did not respond to repeated requests for comment from The Verge for the February 28th article, done in partnership with The Investigative Fund, or from local media since news of the partnership broke.

There is also potential legal fallout from the revelation of New Orleans’ partnership with Palantir. Several defense attorneys interviewed by The Verge, including lawyers who represented people accused of membership in gangs that, according to documents and interviews, were identified at least in part through the use of Palantir software, said they had never heard of the partnership nor seen any discovery evidence referencing Palantir’s use by the NOPD.

Yesterday, Orleans Criminal District Court Judge Camille Buras agreed to hear a motion from Kentrell Hickerson challenging his racketeering and drug conspiracy convictions. Hickerson’s attorney, Kevin Vogeltanz, filed his motion with the court on March 8th, citing the nondisclosure of any relevant intelligence from Palantir about his client’s alleged involvement in the 3NG street gang as a potential violation of Hickerson’s rights. Under the Supreme Court case Brady v. Maryland, defendants have the right to procure any and all potentially exculpatory evidence assembled against them by law enforcement.

Read more!

via Adafruit

Predicting Papaya Ripeness with a Computer Vision Algorithm


From IEEE Spectrum via ScienceDirect:

The University of Campinas researchers teamed up with computer scientists from Londrina State University in Londrina, Brazil, to develop the machine learning approach that achieved an overall ripeness detection accuracy of 94.7 percent. Their work appears in the February issue of the journal Computers and Electronics in Agriculture.

Measuring ripeness, and identifying features relevant to it, was one of the biggest challenges. The researchers started out with a government guidance chart that listed five levels of papaya ripeness, but they soon consolidated those into three maturity levels based on visual inspection: the outer peel of a golden papaya starts out green and yellows as the fruit ripens. They further verified the three levels with additional testing based on each fruit’s pulp firmness.

Training the machine learning algorithm also proved an unexpected challenge: It required a diverse selection of papayas. Researchers had hoped to get a large number of papayas from a local producer but eventually found themselves buying 57 golden papayas at a local market in Campinas.

Both the hardware and software components of the project proved relatively straightforward. On the hardware side, the researchers built a boxy contraption with a consumer digital camera and light bulbs positioned in its ceiling to take illuminated pictures of the papaya samples. Success with such consumer-grade technology means this approach could be adapted fairly readily to commercial applications.

On the software side, the researchers considered a number of different machine learning algorithms before settling upon the common random forest classifier. This approach enabled the researchers to clearly see how different papaya features factored into the machine learning algorithm’s results. “We could see which features are really providing useful information about the fruit,” Barbin explains.
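The article doesn’t include the team’s code or exact feature set, but a minimal sketch of the same idea in Python (hand-picked color features feeding a random forest whose per-feature importances remain inspectable) might look like the following; the feature names and synthetic data are illustrative stand-ins, not the researchers’ actual measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Illustrative per-fruit features: mean hue, mean saturation, and the
# fraction of the peel that reads as yellow. The labels are the three
# consolidated maturity levels (0 = unripe, 1 = part-ripe, 2 = ripe).
n = 57  # the study photographed 57 golden papayas
hue = rng.uniform(30, 90, n)             # degrees: yellow ~45, green ~90
saturation = rng.uniform(0.4, 1.0, n)
yellow_fraction = np.clip((90 - hue) / 60 + rng.normal(0, 0.05, n), 0, 1)
X = np.column_stack([hue, saturation, yellow_fraction])
y = np.digitize(yellow_fraction, [0.33, 0.66])  # three maturity classes

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# The payoff over a black-box deep net: the forest reports which
# features actually drive its decisions.
clf.fit(X, y)
for name, imp in zip(["hue", "saturation", "yellow_fraction"],
                     clf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```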

A deep learning approach based on neural networks might also have yielded good results for visually identifying ripe papayas. But the Londrina State University colleagues were wary of the black-box nature of deep learning algorithms, which usually makes it extremely difficult to figure out how a model arrives at any given result. Furthermore, a deep learning approach would have required a potentially far larger sample of papayas in the training dataset to achieve reasonable accuracy.

Read more from IEEE Spectrum and ScienceDirect

via Adafruit

Star Wars Lightsaber Bionic Arm from @openbionics #SciFiSunday

Absolutely wonderful!

via Adafruit

ROVER the singing robot #piday #raspberrypi @Raspberry_Pi

Rover the Singing Robot from UC Santa Barbara on Vimeo.

UC Santa Barbara grad student Hannah Wolfe created ROVER. This awesome project was built with a Roomba, some sensors, an Arduino and, of course, a Raspberry Pi.

Via Nanowerk:

“I wanted to create an interactive art installation that would make people happy,” said Wolfe, who is based in the rather austere environment at UCSB’s Elings Hall. “I thought if something came up to people and sang to them that would bring joy into the space.”

With a Roomba (sans vacuum) as its feet, a Raspberry Pi as its brain and an Arduino acting as a nervous system, Rover is more than an artistic endeavor, however. It is also a tool for investigating the ways humans respond to robots.

Think about it: We ask iPhone’s Siri or Amazon’s Alexa for help accessing information, making purchases or operating Bluetooth-enabled devices in our homes. We have given robots jobs that are too tedious, too precise or too dangerous for humans. And artificial intelligence is now becoming the norm in everything from recommendations in music and video streaming services to medical diagnostics.

“Whether we like it or not, we’re going to be interacting with robots,” Wolfe said. “So, we need to think about how we will interact with them and how they will convey information to us.”

To that end, Wolfe has ROVER generate sounds — beeps and chirps and digital effects (think R2-D2) — that she found elicit positive versus negative reactions from people. Meanwhile, an onboard camera records how individuals respond to ROVER.
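Neither the article nor the video includes ROVER’s sound code, but a minimal sketch of one way to synthesize an R2-D2-style chirp in Python (a rising sine sweep with a warble, written out as a WAV file) could look like this; every frequency and duration here is an illustrative guess, not Wolfe’s actual sound design:

```python
import wave

import numpy as np

RATE = 44100  # samples per second

def chirp(f_start, f_end, duration, wobble_hz=25.0):
    """A sine sweep from f_start to f_end (Hz) with a slight warble."""
    t = np.linspace(0.0, duration, int(RATE * duration), endpoint=False)
    freq = np.linspace(f_start, f_end, t.size)
    freq += 50.0 * np.sin(2 * np.pi * wobble_hz * t)  # the R2-D2 warble
    phase = 2 * np.pi * np.cumsum(freq) / RATE        # integrate frequency
    fade = np.minimum(1.0, 10.0 * (duration - t))     # fade out the tail
    return np.sin(phase) * fade

# A short rising "happy" chirp followed by a quick descending blip.
samples = np.concatenate([chirp(400, 1600, 0.25), chirp(1200, 600, 0.12)])
pcm = (samples * 0.8 * 32767).astype(np.int16)        # 16-bit mono PCM

with wave.open("chirp.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)   # 2 bytes = 16 bits
    wav.setframerate(RATE)
    wav.writeframes(pcm.tobytes())
```

On a Raspberry Pi, the resulting file can then be played with a stock command-line player such as aplay chirp.wav.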

Read more!

Each Friday is PiDay here at Adafruit! Be sure to check out our posts, tutorials and new Raspberry Pi related products. Adafruit has the largest and best selection of Raspberry Pi accessories and all the code & tutorials to get you up and running in no time!

via Adafruit