In this study we present a software prototype developed within the framework of a research project aimed at improving the usability of search engines for blind users who interact via a screen reader and a voice synthesizer. Following the eight specific guidelines we proposed for simplifying interaction with search engines through assistive technology, we redesigned the Google user interfaces (i.e., the simple search and result pages) using XSL Transformations, the Google APIs, and Perl. A remote test with 12 totally blind users was carried out to evaluate the proposed prototype. The collected results highlight ways in which the Google interfaces could be modified to improve usability for blind users. In our demo we will show how interaction with the modified Google UIs is simplified and how the time needed to reach the most important elements (e.g., the first query result or the next result page) is shortened compared with the original Google UIs. The demo uses the JAWS screen reader to announce the UI contents.
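To illustrate the kind of transformation involved, the sketch below shows how an XSL stylesheet could restructure a result page so that each result title becomes an HTML heading, letting screen-reader users jump between results with a single keystroke. This is a minimal illustrative sketch only: the element and class names (e.g., a `p` element with class `g` marking a result block) are assumptions for the example, not the markup actually used by Google or by the prototype.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <!-- Identity template: copy all nodes and attributes through unchanged. -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- Hypothetical result block: wrap the result link in an <h3>
       so the screen reader can navigate by heading, then copy the
       remaining content (snippet, URL) unchanged. -->
  <xsl:template match="p[@class='g']">
    <h3><xsl:apply-templates select="a"/></h3>
    <xsl:apply-templates select="node()[not(self::a)]"/>
  </xsl:template>

</xsl:stylesheet>
```

Because screen readers such as JAWS expose a "next heading" keystroke, marking each result with a heading in this way is one common technique for shortening the path to the first result.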