Wednesday, July 27, 2016

Who will benefit from the online audio library - updated list

Based on actual usage of the library, we are now releasing the updated list of people who can benefit from CLABIL. It is pure joy to see more and more people come forward to say that they could use this content.

  1. The visually impaired
  2. Rural children
  3. Underprivileged children in urban areas
  4. Girls and women who are not sent to school due to gender issues
  5. All adults who cannot read or write and therefore cannot access knowledge on demand.
  6. Children of Indian origin who don’t know the Indian scripts, and to whom the literature is therefore lost.
  7. People with muscular dystrophy
  8. Children with dyslexia or other print-related learning disabilities, who can use audio.
  9. Older people who lose sight as they age.
  10. Community programs that books cannot reach, or where books cannot be used due to low effective literacy. This content can be used on public broadcast or local radio as a means of change and information dissemination. (We recently spoke with the change management team of the Swachh Bharat Mission, who came up with this idea.)
  11. The terminally ill who need to access content on demand.
Here's to More!!!

Tuesday, July 12, 2016

What is wrong with the screen reader model of accessibility

Pre-read:
http://www.heydonworks.com/article/responses-to-the-screen-reader-strategy-survey

I have never been fond of screen readers, for a very intuitive reason - they try to ADAPT the user experience of the sighted for us. And that, you see, is not how user experience for the differently abled should be designed. Every user interface should be designed with that user in mind - First. The idea of adapting the user interface of one category of users for another category of users is prima facie counterintuitive.

With mobile phones and touch screens, that problem becomes worse. A mobile phone, as we know, can become a real friend with an audio guide like Siri. Instead, my accessibility TalkBack feature first trains me on how to touch the various areas of the screen to reach the app I want.

From a UX perspective, that's all wrong. I don't need to know where the apps are on a screen. I just need to open the one I want. Which means I should be able to tell the device what I want, and the device should be able to meet that need.

We have created touch-screen laptops for our sighted users. If that's possible, tell me again why completely voice-navigated devices appear so unimaginable.

Here, then, is UX Design 101, as applied to the differently abled users we work with:
1. Voice-based recognition and authentication - this includes special training for local accent customisation.

2. Voice-based program triggers - You do realise that we only need a screen to show sighted people our work? So the screen should not be the primary trigger for program activation, change and closure. The voice command or tactile buttons should be. (A minimal sketch of this idea follows the list.)

3. Intra-program actions - this could be website usage, office or personal productivity software, or playing games. Intra-program actions can easily be designed to be voice controlled, and we should be able to review our work using playback.
I highly recommend the native built-in talk-back feature of MS Excel (not sure if it's still there in the new MS Office).

4. Braille? Maybe. We may or may not know Braille. That's all.
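
To make point 2 concrete, here is a minimal sketch of a voice-based program trigger. It assumes the open-source Python speech_recognition library; the command phrases, programs, and URL in the mapping are hypothetical placeholders. A real device would add voice authentication and accent-adapted recognition models, as point 1 describes.

```python
import subprocess

import speech_recognition as sr  # third-party: pip install SpeechRecognition

# Hypothetical mapping from spoken commands to programs;
# a real device would let the user define these themselves.
COMMANDS = {
    "open the library": ["firefox", "https://example.org/clabil"],
    "open my documents": ["libreoffice", "--writer"],
    "play music": ["vlc"],
}


def listen_for_command(recognizer: sr.Recognizer, mic: sr.Microphone) -> str:
    """Capture one utterance from the microphone and return it as lowercase text."""
    with mic as source:
        recognizer.adjust_for_ambient_noise(source)  # basic noise calibration
        audio = recognizer.listen(source)
    # recognize_google is the library's free web-API backend; an offline,
    # accent-trained model would replace it on a real device.
    return recognizer.recognize_google(audio).lower()


def main() -> None:
    recognizer = sr.Recognizer()
    mic = sr.Microphone()
    print("Listening. Say a command, e.g. 'open the library'.")
    while True:
        try:
            command = listen_for_command(recognizer, mic)
        except (sr.UnknownValueError, sr.RequestError):
            continue  # not understood or service unreachable; keep listening
        if command in ("quit", "stop listening"):
            break  # voice-driven closure, not screen-driven
        program = COMMANDS.get(command)
        if program:
            subprocess.Popen(program)  # the voice, not the screen, triggers the program


if __name__ == "__main__":
    main()
```

The point of the sketch is the loop: the device listens, the voice opens and closes programs, and the screen is never required.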

Times Now TV - Esha's online library broadcast in 80 countries

Our online library, CLABIL, and our outreach program with the library were covered by Times Now as part of the news. This clip was broadcast in 80 countries around the world. Thank you for the support.
https://www.youtube.com/watch?v=ISl7Obc5wJk