Apple backtracks on Flash restrictions

Apple says it will allow app developers to more easily bring Flash-based applications to the iPhone, under certain circumstances.

In a statement released today, the company said it was “relaxing all restrictions on the development tools used to create iOS apps, as long as the resulting apps do not download any code.” It’s billing this as a solution to “give developers the flexibility they want, while preserving the security we need.”

While the change will affect a range of tools and systems, the biggest effect is likely to be felt by companies that develop apps in Adobe’s Flash, which will now find it easier to convert them to run on iPhones without the expense of effectively starting from scratch.

The change may be enough to stave off an apparent Federal Trade Commission investigation into whether banning apps converted from Flash is an illegal anticompetitive move. Even so, there’s no suggestion Apple will be forced to allow Flash itself to run on iPhone devices, no matter how much that might upset Adobe.

Apple has also announced that it will publish the review guidelines that determine which apps are allowed into the iTunes store. As we’ve covered on several occasions, the secrecy surrounding the process, along with the seeming inconsistency over what is and isn’t allowed on grounds of taste, has left many developers in the dark – most notably one who had to guess what wording was acceptable to describe the various options in a flatulence simulator.

My money is still on the review process being a monkey throwing banana skins into one of two piles, though I’m guessing Apple will use technical jargon to make that sound more impressive.

A Telepathic Wheelchair

A new neuroprosthetic interface – that is, one that reads minds without drilling into skulls – is being tested by scientists at the École Polytechnique Fédérale de Lausanne’s Institute of Bioengineering.

The practical application – piloting an electric wheelchair using only one’s mind – brings a new type of mobility to the fully paralyzed. The user’s brain patterns are recorded via electroencephalography (EEG) and translated into commands for the chair: to turn left, for example, one imagines moving one’s left hand.

This would be impressive enough on its own, but to take some of the “thinking load” off the user, two small cameras, one on each side of the chair, help it avoid obstacles, much like a Roomba. Together, the two techniques bring new hope for mobility.
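To make the division of labour concrete, here is a rough, hypothetical Python sketch of that shared-control idea (not the EPFL team’s actual software): a label decoded from EEG motor imagery is mapped to a steering command, and the camera-based obstacle check can veto it. All names and commands are illustrative assumptions.

from dataclasses import dataclass

# Imagined movements (as decoded by an EEG classifier) mapped to chair commands.
COMMANDS = {
    "imagine_left_hand": "turn_left",
    "imagine_right_hand": "turn_right",
    "imagine_feet": "go_forward",
    "rest": "stop",
}

@dataclass
class ObstacleReading:
    left_clear: bool
    right_clear: bool
    front_clear: bool

def shared_control(decoded_label: str, obstacles: ObstacleReading) -> str:
    """Combine the user's decoded intention with camera-based obstacle checks."""
    intent = COMMANDS.get(decoded_label, "stop")

    # The "thinking load" reduction: the chair refuses unsafe commands.
    if intent == "go_forward" and not obstacles.front_clear:
        return "stop"
    if intent == "turn_left" and not obstacles.left_clear:
        return "stop"
    if intent == "turn_right" and not obstacles.right_clear:
        return "stop"
    return intent

# Example: the user imagines moving their left hand while the left side is blocked.
print(shared_control("imagine_left_hand",
                     ObstacleReading(left_clear=False, right_clear=True, front_clear=True)))
# -> "stop"

The real system is of course far more sophisticated, but the split is the same: the user supplies the intention, the chair supplies the safety checks.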

It’s not ready for mass production yet, as there are difficulties in object recognition.

The system requires advanced artificial intelligence, since it will need to distinguish between different types of objects: furniture, people, and doorways. “If it is a cabinet, the chair should be directed around it,” Carlson explains. “But if it is a desk, the chair will have to recognize it and approach it appropriately.”

In the future, the system will be able to interpret the user’s higher-level intentions. “We are trying to analyze different brain patterns, such as error-related potentials that may help to disambiguate the intentions of the user,” says Carlson. “Does the user want to avoid the desk or is it his, and should the chair pull up to it so he can work?”