October 5, 2010

DON’T WORRY, ROBOTS NOW GUARDING NUKES IN NEVADA

When you have a 1,360-square-mile military facility that needs patrolling, robots are the way to go, or at least that’s the position the US Army is starting to adopt. Why? Simple: using mobile robots instead of permanent infrastructure (like fixed cameras and motion detectors) saves $6 million in up-front costs plus an additional $1 million a year in maintenance. The robots being used at the Hawthorne Army Depot (which stores tens of millions of cubic feet of low-level radioactive waste) are somewhat unsexily called MDARS (Mobile Detection Assessment Response Systems). They’re diesel-powered, with a top speed of 20 mph, and they can stay on duty for 16 hours. Most of the time, the MDARS run random, fully autonomous patrols, using RFID tags to check the status of locks and gates. If they notice something out of the ordinary, a human can take over, using cameras, microphones, and speakers to interact with whatever needs interacting with. The bots are currently unarmed, so short of running someone over they can’t do much more than observe, although they are certainly capable of mounting remote-controlled weapons. [ NNSA Press Release ] VIA [ Danger Room ]
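The patrol loop described above — wander between checkpoints at random, read each RFID-tagged lock or gate, and flag anomalies for a human operator — is simple enough to caricature in a few lines. Here's a toy Python sketch; the checkpoint names, tag IDs, and `rfid_read` callback are all invented for illustration and have nothing to do with the actual MDARS software:

```python
import random

def patrol(checkpoints, rfid_read, steps=5, rng=random):
    """Randomly visit checkpoints; report any whose RFID tag reads wrong.

    checkpoints : dict mapping checkpoint name -> expected RFID tag ID
    rfid_read   : callable returning the tag ID actually read at a checkpoint
    """
    alerts = []
    for _ in range(steps):
        cp = rng.choice(list(checkpoints))          # random patrol order
        if rfid_read(cp) != checkpoints[cp]:        # tag missing or wrong
            alerts.append(cp)                       # hand off to a human
    return alerts
```

In the real system an alert would switch the vehicle to teleoperation rather than just logging a name, but the scan-and-escalate structure is the same.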
October 5, 2010

PANASONIC’S HAIR WASHING ROBOT

This hair washing robot from Panasonic made the rounds last week, but I figured it was one of those things where good video was important, and DigInfo News came through today. One thing that the video doesn’t elaborate on is how the robot is actually scanning the user’s head shape in three dimensions to figure out just the right amount of pressure to apply, and it’ll use that shape to remember who you are and what shampoo and massage settings you like. In general, Panasonic is trying to create a robotic infrastructure to help Japan (and the rest of the world) deal with an aging population that’s going to need more and more support. As such, this system is primarily targeted at medical environments and not for installing in your bathroom. Yet. [ Press Release ] VIA [ DigInfo ]
October 4, 2010

YOUTUBE – CORDYCEPS FUNGUS

Shared by Daniel: Attenborough forever
October 3, 2010

KID’S WALKER FULFILLS YOUR CHILD’S DREAM OF PILOTING A KICKASS ROBOT SUIT (VIDEO)

Shared by Daniel: Kids born in 2010 have it pretty good. Four years ago Sakakibara Kikai brought us a $300,000 real-life BattleMech, and the company hasn’t sat idle since then — last December, it put the final touches on a significantly smaller exoskeleton designed specifically for children. The Kid’s Walker stands just over five feet tall and weighs four hundred pounds, and though the gasoline-powered creature doesn’t exactly walk, its wheeled feet definitely stroll around. The Japanese company told Gizmag the suit isn’t presently for sale — just rentals for now — but would probably cost about 1.8 million yen (around $21,600) should it come to market. If you ask us, that’s a small price to pay; everyone knows it’s always the young mecha pilots that end up saving the world. Originally appeared on Engadget, Sat, 02 Oct 2010. [ Sakakibara-Kikai ] VIA [ DVICE ]
October 2, 2010

800PX-INTERNET_PENETRATION.PNG

http://upload.wikimedia.org/wikipedia/commons/thumb/a/af/Internet_Penetration.png/800px-Internet_Penetration.png
October 1, 2010

UMBILICAL CABLE BIRTHS AN IPHONE (VIDEO)

Shared by Daniel: Cronenberg would be proud. And here we thought Steven P. Jobs, not media artist Mio I-zawa, was responsible for creating the iPhone. Originally appeared on Engadget, Fri, 01 Oct 2010. [ YunaDigick (YouTube) ] VIA [ PinkPentacle ]
October 1, 2010

RECON-ZEAL TRANSCEND GOGGLES NOW SHIPPING, GPS AND HEAD-MOUNTED DISPLAY INCLUDED

Shared by Daniel: if anyone wants to get me a present for BM next year… Don’t you just love it when a plan comes together? If you’ll recall, we heard that Recon Instruments was fixing to up-end the winter sports goggle market in February of this year, with an optimistic-at-the-time ship date of October 2010. Lo and behold, the outfit has managed to nail its estimate, and the planet’s first GPS-enabled goggles are now available to highfalutin’ skiers and snowboarders. At least initially, the company will be rolling out a limited set, with two models to choose from: the $499 Transcend SPPX is fitted with an SPPX polarized and photochromic lens, while the $399 Transcend SPX features an SPX polarized lens. Aside from the fact that these probably cost less than those ho-hum Oakleys in the ski shop, they’re built on a Zeal Optics frame design with a micro LCD display, which appears to hang approximately six feet in front of the user. That head-mounted display provides real-time feedback to the wearer, including speed, latitude / longitude, altitude, vertical distance traveled, total distance traveled, a chrono / stopwatch mode, a run counter, temperature, and time. Yeah, wow. You can bet we’ll be trying to snag a set for review when we do our best impression of “hitting the slopes” post-CES. Originally appeared on Engadget, Fri, 01 Oct 2010. [ Recon Instruments ]
September 30, 2010

IROBOT PATENT SHOWS ‘CELESTIAL NAVIGATION SYSTEM’

iRobot has had a virtual monopoly on the consumer robot vacuum market since it introduced the Roomba in 2002. But with some new competition this year, there’s now a real perception problem when consumers compare a Roomba’s cleaning technique to that of the Mint or the Neato XV-11, which are able to localize themselves, map a room, and clean in straight, efficient lines. Irrespective of whether ‘smart’ cleaning is more or less effective (and iRobot argues convincingly that its unstructured patterns do in fact clean better), Roomba’s pseudo-random behaviors seem less sophisticated and ‘dumber’ by comparison. The Roomba, which (I would argue) hasn’t seen a significant upgrade since the release of the third-generation 5xx series in 2007, is going to need some kind of upgrade, because consumers are now expecting household robots to be smarter. This patent application, last updated in April of this year and unearthed by Robot Stock News, seems to suggest that iRobot is at least considering adding localization to its line of cleaning robots. The patent is for a “Celestial Navigation System for an Autonomous Robot,” which works by using (and stop me if you’ve heard this one) a projector to put IR spots on your ceiling that the robot uses to figure out where it is. Yep, sounds a lot (suspiciously a lot) like NorthStar, used by the Mint. And just like NorthStar, if iRobot implements this system it’s going to mean that you’d need a projector in every room you want cleaned. The Celestial Navigation patent does talk about a bunch of potentially interesting features… For example, each room gets its own ID, so you could schedule your robot to clean specific rooms at specific times, and then monitor its progress on a remote. And there’s even the suggestion that the beacons will be powered […]
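The beacon idea — a projected IR spot at a known ceiling location that the robot sights to fix its own floor position — reduces to simple trigonometry once you grant a few assumptions. Here's a toy Python sketch; the geometry is invented for illustration (it assumes the robot knows the ceiling height, the spot's per-room position, and its own heading), and it is not the actual math from the iRobot patent or from NorthStar:

```python
import math

def locate_from_beacon(beacon_xy, ceiling_h, elevation, azimuth):
    """Estimate the robot's floor position from one sighted ceiling IR spot.

    beacon_xy : (x, y) of the projected spot on the ceiling (known per room)
    ceiling_h : height of the ceiling above the robot's sensor, in metres
    elevation : measured angle up to the spot (radians; pi/2 = directly overhead)
    azimuth   : world-frame bearing toward the spot (radians; needs a known heading)
    """
    # Horizontal distance from the robot to the point directly under the spot.
    d = ceiling_h / math.tan(elevation)
    bx, by = beacon_xy
    # Step back from the beacon along the sight line to get the robot's position.
    return (bx - d * math.cos(azimuth), by - d * math.sin(azimuth))
```

With two or more spots per room you could also recover the heading instead of assuming it, which is presumably why such systems project a pair of points.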
September 29, 2010

YOUTUBE – MIND READING BREAKTHROUGH – BASIC ENGRAMS MAPPED

Shared by Daniel: Probably worth recording for posterity.
September 27, 2010

EPFL DEVELOPS LINUX-BASED SWARMING MICRO AIR VEHICLES

The kids at Ecole Polytechnique Federale de Lausanne (or EPFL) have been cooking up quite a bit lately, as this video demonstrates. Not only have they put together a scalable system that will let any flying robot perch in a tree or similar structure, but now they’ve gone and developed a platform for swarming air vehicles (with Linux, no less). Said to be the largest network of its kind, the ten SMAVNET swarm members control their own altitude, airspeed, and turn rate based on input from the onboard gyroscope and pressure sensors. The goal is to develop low-cost devices that can be deployed in disaster areas to create ad hoc communications networks, although we can’t help but think this would make the best Christmas present ever. See for yourself after the break. Originally appeared on Engadget, Mon, 27 Sep 2010. [ EPFL ] VIA [ Make ]
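The altitude part of that onboard control loop — convert a static-pressure reading to altitude, then steer toward a target height — is easy to caricature. A minimal Python sketch: the barometric conversion is the standard ISA approximation, but the proportional gain and command convention are invented here, and none of this is EPFL's actual controller:

```python
def altitude_command(pressure_hpa, target_alt_m, kp=0.1):
    """Proportional climb/descend command from a static-pressure reading."""
    # ISA approximation: pressure altitude in metres from pressure in hPa,
    # referenced to standard sea-level pressure (1013.25 hPa).
    alt_m = 44330.0 * (1.0 - (pressure_hpa / 1013.25) ** 0.1903)
    # Positive output => climb, negative => descend.
    return kp * (target_alt_m - alt_m)
```

A real fixed-wing swarm member would blend this with the gyro-derived turn-rate and airspeed loops, but each loop has this same error-times-gain shape at its core.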
September 27, 2010

RAYTHEON REVAMPS SARCOS EXOSKELETON, CREATES BETTER, FASTER AND STRONGER XOS 2 (VIDEO)

When we first laid eyes on the Sarcos XOS military exoskeleton three years ago, its sheer power and dexterity left us in awe… but as you can see immediately above, that wasn’t enough for Raytheon. Today, the defense contractor’s unveiling the XOS 2, a lighter, stronger robotic suit that uses 50 percent less power for dropping and giving us several hundred pushups. Video and a press release after the break don’t specify the suit’s military duties (they’re focused on instilling the notion that the XOS 2 is a real-life Iron Man), but we can definitely imagine these causing some serious damage if Hammer Industries decided to weaponize that high-pressure hydraulic frame. Update: We previously stated that the suit didn’t need to be tethered to a power source for operation, but that information was incorrect. [Thanks, SmoothMarx] Originally appeared on Engadget, Mon, 27 Sep 2010.
September 27, 2010

GIMME ROBOT WILL CREEP YOU OUT UNTIL YOU GIVE IT MONEY

Have you ever felt uncomfortable asking people for money? So has Chris Eckert, so he built a robot to help him out, named Gimme: Gimme is a two-axis, numerically controlled sculpture that pans a room looking for people. Once found, the machine tracks a person, cajoles them into making a donation, and resumes scanning the room searching for potential donors. The sculpture is controlled by an Arduino Pro Mini. Stepper motors are driven by two Pololu A4983 Stepper Motor Driver Carriers. The microcontroller, stepper drivers, and sensors are all mounted on a custom circuit board made with Eagle CAD. While I absolutely love the construction of this robot, I’m not quite sure what to make of the bare eyeball… Part of me says “cute!” Part of me says “get it away from me!” But for the purposes of this robot, that’s probably perfect: first it draws you in, and then when it’s too late, you have to pay it to get it to leave you alone. Genius. [ Gimme ] VIA [ Make ]
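Eckert's scan-then-track behavior maps neatly onto a two-state controller: sweep the pan axis back and forth until the vision system reports a person, then servo to keep them centered. A toy Python sketch of that state machine (the state names, gain, and sweep limits are invented; the real Gimme runs on the Arduino and stepper drivers described above):

```python
SCAN, TRACK = "scan", "track"

class Gimme:
    """Toy two-state pan controller: sweep until a face is seen, then track it."""

    def __init__(self):
        self.state, self.pan, self.step = SCAN, 0.0, 2.0  # pan angle in degrees

    def update(self, face_offset):
        """Advance one control tick.

        face_offset: horizontal offset of a detected face from centre, in
        degrees, or None if no face is currently visible.
        """
        if face_offset is None:
            self.state = SCAN
            self.pan += self.step
            if not -90.0 <= self.pan <= 90.0:   # reverse at the sweep limits
                self.step = -self.step
                self.pan += 2 * self.step
        else:
            self.state = TRACK
            self.pan += 0.5 * face_offset        # proportional centering
        return self.pan
```

Losing the face for one tick drops it straight back to scanning; a real build would probably add a short grace period before giving up on a donor.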
December 5, 2009

BODY LANGUAGE

This is old news but talk of Google’s Public DNS brought up this bit of data: Marissa ran an experiment where Google increased the number of search results to thirty. Traffic and revenue from Google searchers in the experimental group dropped by 20%. Ouch. Why? Why, when users had asked for this, did they seem to hate it? After a bit of looking, Marissa explained that they found an uncontrolled variable. The page with 10 results took .4 seconds to generate. The page with 30 results took .9 seconds. Half a second delay caused a 20% drop in traffic. Half a second delay killed user satisfaction. Just a friendly reminder that computers are not pure syntax manipulators; they are embodied systems with complex non-formal behavior to which we are highly sensitive.
December 6, 2009

HELLA DROP SHADOW

I just read an excellent article called The Dark Side of Digital Backchannels in Shared Physical Spaces. I have nothing to really add to the analysis, except to say that these are circles I wish I traveled in. I should move to Silicon Valley and become a freelance philosopher. The article also references the Online Disinhibition Effect, which I had somehow forgotten to mention in my classes this semester, so I was grateful for the reminder. The Wikipedia entry for the online disinhibition effect lists six components:

You Don’t Know Me (Dissociative anonymity)
You Can’t See Me (Invisibility)
See You Later (Asynchronicity)
It’s All in My Head (Solipsistic Introjection)
It’s Just a Game (Dissociative Imagination)
We’re Equals (Minimizing Authority)

However, when online tools are used in shared physical spaces, they transform them into what Adriana de Souza e Silva and others call hybrid spaces. In such spaces, the first four components are not as relevant or applicable, and so the hybrid disinhibition effect may only involve the last two; I think the one that best explains the Twittermobbing at conferences is the last one. Perhaps I am too deep into my research to see outside my own little world, but it strikes me that one might plausibly interpret Turing’s test as an endorsement of disinhibition in the last two senses: that we ought to treat our interactions with some machines as a game among equals, contrary to our normal biases against machines. In other words, although the online disinhibition effect is often discussed as a negative consequence of shared digital spaces (Wikipedia links its article to antisocial personality disorder, for instance), it is important to remember that sometimes disinhibition can be a virtue, especially when the norms that inhibit us are themselves negative and stifling.
December 18, 2009

MURDER

January 25, 2010

I NEED TO BELIEVE

Hadn’t seen this yet, recording for posterity.
February 1, 2010

WHY ROBOTICS IS LESS IMPORTANT THAN AI

Augmented (hyper)Reality: Domestic Robocop from Keiichi Matsuda on Vimeo.
February 26, 2010

ON CHALMERS

David Chalmers at Singularity Summit 2009 — Simulation and the Singularity. First, an uncontroversial assumption: humans are machines. We are machines that create other machines, and as Chalmers points out, all that is necessary for an ‘intelligence explosion’ is that the machines we create have the ability to create still better machines. In the arguments below, let G be this self-amplifying feature, and let M1 be human machines. The following arguments unpack some further features of the Singularity argument that Chalmers doesn’t explore directly. I think, when made explicit and taken together, these show Chalmers’ approach to the singularity to be untenable, and his ethical worries to be unfounded.

The Obsolescence Argument:
(O1) Machine M1 builds machine M2 of greater G than M1.
(O2) Thus, M2 is capable of creating machine M3 of greater G than M2, leaving M1 “far behind”.
(O3) Thus, M1 is rendered obsolete.

A machine is rendered obsolete relative to a task if it can no longer meaningfully contribute to that task. Since the task under consideration here is “creating greater intelligence”, and since M2 can perform this task better than M1, then M1 no longer has anything to contribute. Thus, M1 is ‘left behind’ in the task of creating greater G. The obsolescence argument is at the heart of the ethical worries surrounding the Singularity, and is explicit in Good’s quote. Worries that advanced machines will harm us or take over the world may be implications of this conclusion, but not necessarily so. However, obsolescence does seem to follow necessarily from an intelligence explosion, and this on its own may be cause for alarm.

The No Precedence Argument:
(NP1) M1 was not built by any prior machine M0. In other words, M1 is not itself the result of exploding G.
(NP2) Thus, when M1 builds […]
March 9, 2010

AR SCREENING

Been arguing about AR, archiving for posterity.

Augmented reality

Augmented reality will be the most important technological and social change since the widespread adoption of the internet. The internet has been around for decades, but it wasn’t until computing hardware was ubiquitous that the technology was able to serve as a platform for radical social, political, and economic change. Similarly, AR technologies have been around for a while, but only now is the hardware ubiquitous. Everyone is carrying computers in their pocket, computers that are networked and equipped with cameras and GPS, and are about as powerful as the PCs that fueled the first few years of the internet. My personal hope is that the Hollywood-backed push for 3D multimedia will promote the widespread use of “smart glasses”, connected by Bluetooth to the smart phone in your pocket, with a HUD for fully-immersive, always-on AR. The technology is already there, or close enough for early adopters; it just all needs to get hooked up in the right way.

AR tattoos
Your face is a social business card
3D AR on the fly
From image to interactive 3D model in 5 minutes
Photosynth + AR
Arhrrrr
The future of advertising
Ali G QR
Google Translate
Hand from above
Projection on Buildings
Pinball

The Ladder is a mixed-reality installation. The room is plain apart from a window, cut high into the wall, and a ladder. A tiny virtual character, who can only be seen through the computer screen, stands on a ladder and looks out of the window to the physical world. He keeps voicing concerns as to the nature of the world, tracing shapes with his hands and trying to describe the scene. The screen is on a rig so that you can pan it across the room but the boy stays […]
March 15, 2010

FREEDOM

May 2, 2010

NATURAL LANGUAGE

Can someone explain this comment to me? It sounds almost like something I’d say, but in the mouth of someone else I have no idea what it means. “Humans are good with language,” says Boris Katz, lead research scientist at MIT’s Computer Science and Artificial Intelligence Laboratory, the principal group working with Nokia. “We want language to be a first-rate citizen” on cell phones, he says. |link|
May 12, 2010

WHO CARES ABOUT INTEGRITY

I decided to start importing my Google Reader shared items directly into this blog using FeedWordPress. The reason for this change is that my blog has always served as a repository for examples and material to use in class. Currently my shared items are broadcast only through Reader itself, and through Buzz, but everyone I tell to use Buzz looks at me like a cultist. In any case, it would be nice to have access to the material without signing into my Google account. That means a lot of stuff posted here won’t be my own writing, but it will be clearly marked as not my own, and will be tagged as autoblogging. I’ll still post my own occasionally, but it would be nice to have my blog active in at least some capacity instead of just sitting dead. I used to import my blog posts onto Facebook, but I don’t want to overload the news stream with my shared items, so I’ve stopped importing into Facebook. That means when I write a post by hand, I’ll have to bring it to Facebook manually; but that’s ok because I’m not writing much anymore anyway. And I don’t expect Facebook to stick around much longer. This might result in some of my (two) readers receiving multiple copies of my posts in various media streams. If it is annoying and there is anything I can do to help it, let me know.