You will have to take my word for it, but I am writing this with my eyes closed! I am using an amazing keyboard on my iPad, and I am able to do this after only about ten minutes of learning and practice.
Fleksy is a new method of text input that has been developed for blind people who use devices with touch screens. Fleksy’s developers are encouraging sighted users to give it a try as well. I believe Fleksy holds promise not just for the millions who want to enter text while walking or doing other tasks, but also for individuals with limited physical dexterity.
I’ve tried Fleksy on my iPad and found it to be remarkably effective. Controlled with a few simple gestures, it’s easy to learn and to use. The only prerequisite is familiarity with the layout of the QWERTY keyboard. Along with clear auditory feedback, Fleksy’s effectiveness rests on a powerful word-prediction engine that is incredibly accurate.
Fleksy is currently available in the iTunes Store as a free download for iPhone and iPad. The free app enables the user to try Fleksy, but an in-app purchase for $1.99 enables text created in Fleksy to be sent and/or used elsewhere. Developers are inviting beta testers for an Android version.
Here’s a high-quality promotional video from the developers.
Here’s a free iPad app that has achieved near perfection. You can now use the iPad camera, even the camera on an iPad 2, to photograph text and import it as editable text into PaperPort Notes. If the imported text retained the formatting of the original page, I think I’d call the app perfect.
PaperPort Notes was already one of the most versatile and polished iOS apps available for supporting written output.
Text can be entered with the keyboard, via voice-to-text, or with a stylus. Audio recordings can also be attached to notes.
Notes can be written on yellow or white lined pages, on blank white pages, or on “graph paper”.
PDF files can be imported from almost anywhere: PaperPort Anywhere (dedicated free cloud storage), Box, Dropbox, Docs Folder, files and snapshots from the Web, the iPad’s Clipboard, and camera images.
Text boxes and sticky notes can be added to notes and imported files.
Multi-color highlighting is available.
Work created or modified in PaperPort Notes can be shared via PaperPort Anywhere, email, Google Docs, Box, Dropbox, or Docs Folder (with audio attached to the PDF). PaperPort Notes files can also be opened in many other apps on the iPad.
Now you can use the iPad’s camera to capture text, and that text can be imported into PaperPort Notes as editable text. With ‘Speak Selection’, this text can even be read aloud. Alternatively, you can import images of text that are stored on the iPad. You need to sign up for a free account with the OCR engine, but once the account is set up, it could hardly be easier to convert and import images of text.
To import an image of text, simply tap the ‘Image to Text’ button and follow the prompts on the screen that opens. After processing, the imported text appears in the panel at the right. If desired, the text can be edited before being inserted into your note.
My most-visited post ever was 10 Apps for Learners Who Struggle with Reading and/or Writing (Feb. 2012). Since then, my list of iOS apps has been expanding. When I’ve shared these apps in person, as I did this week in an ISTE webinar, the presentation has been well received. So, here’s my updated list of iOS apps, with point-form annotation. The same information is available on my UDL Resource website, and that’s where I will continue to update the list.
I don’t claim that the apps shared here are the only solution or even the best solution. I have spent considerable time and money exploring apps to support reading and writing, and these are apps that I have found to be effective. The 24 apps listed below are organized under the following headings: 1) Supports for Reading; 2) Supports for Writing; 3) Alternatives to Writing; 4) Research Supports; 5) Visual Supports; 6) Supports for Written Math Work.
A basic tenet of Universal Design–in architecture as well as in learning–is that if you design for people “in the margins”, it tends to benefit everyone. The most oft-cited example of this is the curb cut. Increasingly, the reverse is also true. Mainstream technology designed to improve functionality for everyone is especially helpful for individuals who face exceptional challenges. The advances in voice recognition, or speech-to-text, offer a striking example of this.
The latest version of the Google Search app for both iOS and Android devices shows just how far mainstream voice recognition has come! Asking a question yields not only search results; in many instances you can also hear a spoken answer. When I tried the app on my iPad, I was blown away by the accuracy and the speed of this powerful tool. Many individuals who face a wide range of challenges will benefit from the new Google Search app.
I support some non-speaking children who are learning to use the iPad as a communication device. Sometimes, these learners are inclined to ignore their communication apps because they much prefer other apps. These preferred apps may be worthwhile, but they can get in the way of learning to use the device for its primary purpose. In iOS 6, Apple has introduced an accessibility feature that can help.
Guided Access makes it possible to keep the iPad in a single app, and to control which features of an app are available to a user. In the case above, the iPad might be configured as a dedicated communication device until the user has learned to use it for effective communication. This is only one of many potential situations where Guided Access might be helpful. Of course, Guided Access is also available on the iPhone and iPod Touch.
As is the case with all accessibility features on Apple’s iOS devices, Guided Access must be turned on in the device’s ‘Settings’:
Settings — General — Accessibility — Guided Access
The image below shows Guided Access in settings. Below that is a short video demo of Guided Access.