I am a blind Spotify user. I use a PC with software called a “screen reader”, which reads the text on screen to me and helps me interact with the interface of different apps. There are millions of screen reader users in the world.
The reason I bought “Equalify” was the ability to play Spotify through a sound card other than the default one. Unfortunately, to install and configure “Equalify” I need sighted help, because the installer of the app and its settings are not accessible to my screen reader, “NVDA”. It does not read the controls of the installer and settings to me. That makes my independent usage of the product impossible. Please fix the accessibility of “Equalify”; I am sure that many other blind users will be thankful for that.
I am not sure how easy this would be to implement, but I don’t see any reason why it can’t be tried.
I support this. I took the gamble and bought the lifetime license, but actually it has been a waste of money because I just can’t use Equalify at all. I don’t know if the Equalify settings screen would work with a screen reader, but the main problem is that the button that is supposed to be visible on the Spotify screen is just invisible to a screen reader like NVDA. Perhaps even implementing a hotkey to bring up the Equalify settings would be a first step forward. Or provide a settings configuration utility we can run as a stand-alone executable.
My interest in Equalify was really to send sound to a different sound card but I just haven’t been able to do that.
Hope this is fixed.
Hey @profesora93, I’m sorry that I missed your post until now.
(this is also a reply to @clansink46)
To be absolutely honest this is something that never even crossed my mind while developing the application.
This is pretty much a one-man show, and not having released apps like this before, there are probably a lot of things I have overlooked.
I’m sure the installer issue could be fixed fairly easily by finding an install builder that works well with screen readers.
If you know of any install builders, or any apps whose installer works well for you, I’d appreciate it if you could let me know, and I can try to make that more accessible.
The application itself is a whole different can of worms. It uses no standard components for its GUI.
The parametric part of the equalizer I don’t have any clue how to make accessible to the blind, as it requires dragging bands on the X and Y axes with lots of additional settings…
Some hotkeys could probably be implemented to show the device changer, but I wouldn’t know how to communicate to blind users what they are and how to use them.
If either of you wants to help make it more accessible and help test, please let me know. I’m willing to give it a shot and try to make as much as possible work. But I will need help from you to make something that is as optimal as possible.
Hi there. I’ll give it a go. And yes just to be clear before we go much further, I am totally blind and I use a screen reader. My favourite one is the open source one, NVDA. I am also a software developer, though I work mostly with back-end services that don’t have a GUI.
It seems to me you have created Equalify on your own and probably without a lot of time to spend on it. Hopefully the following thoughts will help.
Regarding the installer, I understand the comment that it didn’t work with a screen reader. But actually, I have done an install and an update of Equalify myself and I did get it to work. I think however I really just had to accept the defaults, which was ok for me. So the installer experience could be a lot better.
As a developer, I haven’t had to create many installers but I have found one installer builder that is both easy to set up as a developer, and the installers it builds work properly with screen readers. It is the NSIS installer (Nullsoft Scriptable Install System), http://nsis.sourceforge.net. I just went there and I see they have a recent release which I haven’t tried. But hopefully they haven’t broken anything. Anyway you could spend an evening looking at it to see if it does what you want.
Now to Equalify itself. And here we might have a problem because I think you must be working with the Spotify SDK, which I think is officially deprecated but obviously it still works. I was looking at developing my own add-on for Spotify but didn’t get around to it. But now I think I might be too late.
Anyway somehow Equalify creates a button on the Spotify screen, but it seems to be completely invisible to screen readers. That means there just doesn’t seem to be any way to activate it if you can’t actually get the mouse onto it and click it (I assume). So what are the options?
Does the SDK let you add something to the Spotify menu bar? The Spotify menu is fully accessible (which is a term that means it works with screen readers). If you could add Equalify settings to say the play menu, that would work and it wouldn’t impact at all on the visual button.
I can tell by the way Spotify interacts with the screen reader that it is presenting the screen as a document object, a bit like an HTML document in a browser. Are you able to add your button to that object in some way? So in HTML, for example, if you just create a button element that executes a JavaScript function or whatever when it is clicked, if that button appears in the document object structure, a screen reader will find it. On its own, it might just say “button”, but it is a simple matter to add a label or prompt to it so the screen reader could say something like “Equalify”. So the question is will the SDK let you create the Equalify button as part of the Spotify document object?
If that is not an option, then the question is how exactly are you creating this button. What language are you working in? Sometimes it is just a matter of what UI controls you import into your application. In the Equalify code, when you define the button, is there a label or prompt you can set? If so, a screen reader may find it. Can you register it in the Windows hierarchy? If it is a child element of the main Spotify window and it has a label, then most likely a screen reader will find and identify it. Now if you don’t want a text label to actually display and interfere with the visual appearance of the button, you can use a trick like setting the colour to be exactly the same as the background so it won’t be seen, or set the actual display size to 0 or 1 pixels perhaps. Often screen readers will read what is defined, and not just what is visible. Sometimes that can be a pain but sometimes you can turn it to your advantage.
Another option is to just add a hotkey to your button. Again it might depend on the language and development system you are using as to whether a hotkey can be attached to a button as you define it.
The final option is to provide a completely separate stand-alone utility that configures Equalify. You might not even release it other than to people who need it. Its job is just to create the same settings file as if it was saved by Equalify. So while it might not let us make adjustments while Equalify is actually running, it might still be sufficient to just make changes to the settings and save them so Equalify can read them in the usual way when it does run.
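To make that idea concrete, here is a rough sketch of what such a stand-alone utility could write. Everything here is an assumption for illustration: the file name, the INI layout, and the section/key names are all made up, since only the developer knows what Equalify actually reads. The real utility would have to produce exactly the format Equalify itself saves.

```python
# Hypothetical sketch of a stand-alone "Configure Equalify" utility.
# The settings file name, location, and INI layout are all assumptions --
# the real tool would have to write exactly what Equalify itself reads.
import configparser


def write_settings(path, device, bands):
    """Write an INI-style settings file with an output device and EQ bands.

    bands is a list of (frequency_hz, gain_db) pairs.
    """
    config = configparser.ConfigParser()
    config["output"] = {"device": device}
    for i, (freq, gain) in enumerate(bands):
        config[f"band{i}"] = {"frequency": str(freq), "gain": str(gain)}
    with open(path, "w") as f:
        config.write(f)


def read_settings(path):
    """Read the settings file back, as Equalify would on startup."""
    config = configparser.ConfigParser()
    config.read(path)
    return config
```

Because such a utility uses no GUI at all (or only standard controls), it would be accessible from day one; Equalify would simply pick up the file the next time it loads its settings.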
So now to the settings themselves. Remember I have no knowledge of these. Now the thing about screen readers is that if you build an app using, say, C++ in Microsoft Visual Studio, and you just create a UI with editboxes, comboboxes, checkboxes, sliders and so on, chances are they will just work with a screen reader. It does mean making sure you add labels and/or prompts and so on so the screen reader knows what to say, and putting your controls into the right tab order so you can tab between them. But I think Visual Studio actually reminds you to do that. And this is the thing about accessibility. Most often programs are fully accessible when they start out, but they become inaccessible when developers try to create something with a particular look and feel that impacts on how the standard controls work.
I would think the choice of soundcard would just be a simple combobox. Most programs I know that can choose the soundcard work like that. So hopefully that is no problem.
But I take your point about the equaliser, where you say it relies on dragging bands on the X and Y axes. Now the well-known Audacity open source sound editing software has an equaliser option that is very accessible. So that is one example. I don’t know what it is like visually but I presume people like it. But it has standard controls so a blind person can just enter the values directly. The trick is to realise that behind the scenes an equaliser, whether multi-band or parametric, translates to a series of parameters. So for a multi-band equaliser you just specify the gain or attenuation (usually in dB) for each centre frequency. This can be done with a series of editboxes, or spinners. A parametric equaliser is a bit more complicated in that it depends on how many bands you need, but for each band you specify things like the centre frequency, the gain/attenuation, and something about the shape. Again, in the end, whatever the sighted person is doing on the screen visually, behind the scenes you are extracting and saving the data elements. So you just need standard controls to enter them. I’m not certain but I think Audacity has examples of both.
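To illustrate that a parametric band really is just a handful of numbers, here is a sketch using the widely used RBJ “Audio EQ Cookbook” peaking filter. This is not Equalify’s actual DSP (I have no knowledge of that); it just shows that centre frequency, gain and Q typed into plain editboxes are enough to define a band completely.

```python
# Sketch: one parametric EQ band reduced to three numbers, using the
# standard RBJ "cookbook" peaking-filter formulas (not Equalify's code).
import math


def peaking_band(freq_hz, gain_db, q, fs=44100):
    """Biquad coefficients for one peaking band.

    The whole band is described by centre frequency, gain in dB and Q,
    so plain editboxes or spinners are enough to drive it.
    Returns (b, a) with a[0] normalised to 1.
    """
    A = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * math.pi * freq_hz / fs
    alpha = math.sin(w0) / (2.0 * q)
    b = [1.0 + alpha * A, -2.0 * math.cos(w0), 1.0 - alpha * A]
    a = [1.0 + alpha / A, -2.0 * math.cos(w0), 1.0 - alpha / A]
    return [x / a[0] for x in b], [x / a[0] for x in a]


def response_db(b, a, freq_hz, fs=44100):
    """Magnitude response of the biquad at freq_hz, in dB."""
    w = 2.0 * math.pi * freq_hz / fs
    z1 = complex(math.cos(-w), math.sin(-w))  # e^{-jw}
    num = b[0] + b[1] * z1 + b[2] * z1 * z1
    den = a[0] + a[1] * z1 + a[2] * z1 * z1
    return 20.0 * math.log10(abs(num / den))
```

A nice property of this formulation is that the response at the centre frequency is exactly the gain you asked for, so values entered numerically behave just like values set by dragging a band around the screen.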
You might not want these controls to clutter the visual display. But one trick might be to just have one button that is visible that causes those controls to display or not display as the user wants. Or if it is a multi-tab UI, you could create a separate tab for that purpose.
Or maybe you could do it like Winamp. Again I don’t know about Winamp visually, but its equaliser is I think a ten-band eq which is adjusted by the top two rows of keys on the keyboard, and that’s all there is to it.
I hope that gives you something to think about. It’s not like we expect you to do weeks more work on this but you might actually enjoy the challenge and be a more knowledgeable developer as a result. I’m certainly willing to test anything you come up with.
Hello, thank you for responding so quickly and in depth.
You have a lot of good points. I’ll address a few of the paragraphs first and add a few questions at the end of the reply.
I’ll give NSIS another shot once Equalify is somewhat accessible, and I will most likely provide it as a separate installer on the download page.
Equalify does not use the Spotify SDK at all, actually; it has all been made from scratch using basically the Win32 API and some tricks to hook into the Spotify window and intercept the audio before it’s played. The Spotify SDK is not usable for anything like this. Adding an item to one of Spotify’s menus could possibly be done, but it would have to be a “hack”, as there is no SDK/API to work with the Spotify window at all, and it would likely create visual artifacts for those that can see it… But it is an option.
Quote: “Sometimes it is just a matter of what UI controls you import into your application.”
About the buttons and other objects, like I said in the post above, “It uses no standard components for its GUI”.
The button in the Spotify window is not a regular button; it’s 100% custom, as are all the buttons, sliders, checkboxes and text in the EQ and settings. The only two elements that are standard components are the preset dropdown combo box and the device select combo box in the settings.
This was done to create the right look for the app, and since I did not think of the blind when creating it, all controls and text in the window are invisible to screen readers.
You are completely correct about how the Spotify GUI is made. It is basically a Chromium window that fills the entire application window. The only “native” parts of the GUI are the title bar and the menu bar; everything else is a web app running in its own browser. There is no way for me to modify the web object in any reliable way.
The parametric equalizer should actually be somewhat usable already, as long as I add a way to find the bands in the window. All bands have context menus that allow you to set all the settings (frequency, gain, Q (width) and filter type), and all the dialogs it shows are standard and work with NVDA (I’ve already begun testing).
Quote: “I hope that gives you something to think about. It’s not like we expect you to do weeks more work on this but you might actually enjoy the challenge and be a more knowledgeable developer as a result.”
The challenge and acquiring new knowledge is what drives me. I’d be more than happy to give it my best to get this working. And as long as you are willing to help and test, I’m sure we can get it working in no time.
I’ve already made a new branch and tested out a couple of ideas (listed below).
Now to the questions:
Would making a built-in TTS option interfere with your other screen readers?
Making the application work with the other screen readers would be very hard from what I’ve seen, but adding my own speech output on mouse events, like hovering over buttons or settings, would be relatively easy. It would only be active for the equalizer window and the Equalify button in the Spotify window, of course.
I’ve already made a little test app that uses the standard Microsoft TTS engine, and it seems to work fine for me.
The only other option I see for making the current application somewhat screen reader friendly would be to add tooltips to everything, but those have a 1-2 second delay when hovering over items, so that might not work…
I’ve tested Equalify with NVDA and it cannot find anything to say in the EQ window, except the preset dropdown after it has been shown, or the device changer combo box when you navigate to the right tab in the settings.
If I go with the built-in TTS, I would make it ignore those two controls and let the regular screen reader handle them.
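For what it’s worth, the announcement half of a self-voicing scheme can be kept separate from the TTS engine itself. A rough sketch, with made-up control names and a pluggable speak callback (in the real app the callback would call the Microsoft SAPI voice instead of collecting strings):

```python
# Sketch of the announcement side of a self-voicing UI.  The control
# names and the hover event are invented for illustration; in the real
# app the speak callback would drive the Microsoft SAPI TTS engine.
class HoverAnnouncer:
    def __init__(self, labels, speak):
        self.labels = labels  # control id -> spoken label
        self.speak = speak    # callback, e.g. a SAPI voice wrapper
        self.last = None      # avoid re-announcing the same control

    def on_hover(self, control_id):
        if control_id == self.last:
            return            # still on the same control; stay quiet
        self.last = control_id
        label = self.labels.get(control_id)
        if label:
            self.speak(label)
```

Keeping the label lookup and the “don’t repeat yourself” logic out of the TTS layer makes it easy to swap the speech engine later, or to route the same announcements to tooltips instead.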
OK so you’re not using the SDK. So that suggests to me that even though it is designed to work with Spotify, Equalify is more like a stand-alone application.
I said earlier that applications will often work straight away with screen readers if the right controls are used. You have said you are using your own buttons. NVDA uses APIs like Microsoft’s Active Accessibility and UI Automation to get information about what is on the screen and convey it to blind users. So even though you are using your own buttons, you could look at making your controls work with those APIs. I don’t know a lot about the internals of NVDA, but I think it starts out by interrogating these APIs as well as the hierarchy of windows registered to the desktop. It uses this information to work out which control has focus at any time and convey current information about that control to the blind user. The user can either tab between the controls available in the focused application, or alternatively can use the screen reader’s object navigation keys to traverse the hierarchy of windows and controls to get to controls that can’t be tabbed to.
Your button to activate Equalify just seems to be invisible right now. So really that is the first problem to solve if we are to get anywhere at all. We need a way to run the Equalify settings screen.
You have said it would be a real “hack” to add it to the Spotify menu so let’s shelve that idea for now.
Another suggestion is to come up with a hotkey, such as Shift+Ctrl+Alt+E. You might be able to just monitor keypress events to see when this or a similar hotkey is pressed, and just bring up the settings screen. I have worked with keypress events myself, although only with console applications, but instinctively I think that should be quite easy to do.
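As a sketch of the hotkey idea: on Windows the usual route is the RegisterHotKey API, which takes a set of modifier flags and a virtual-key code and then delivers WM_HOTKEY messages. The parsing below is just an illustration (Python rather than whatever Equalify is written in); the MOD_* values are the documented winuser.h constants, and the actual registration only exists on Windows, so it is left as a comment.

```python
# Sketch: turn a hotkey spec like "shift-ctrl-alt-e" into the modifier
# flags and virtual-key code that the Win32 RegisterHotKey call expects.
# MOD_* values are the documented winuser.h constants.
MOD_ALT, MOD_CONTROL, MOD_SHIFT, MOD_WIN = 0x0001, 0x0002, 0x0004, 0x0008
_MODS = {"alt": MOD_ALT, "ctrl": MOD_CONTROL, "shift": MOD_SHIFT, "win": MOD_WIN}


def parse_hotkey(spec):
    """Return (modifier_flags, virtual_key) for a spec like 'shift-ctrl-alt-e'."""
    *mods, key = spec.lower().split("-")
    flags = 0
    for m in mods:
        flags |= _MODS[m]
    return flags, ord(key.upper())  # VK codes for A-Z equal their ASCII values

# On Windows you would then register the hotkey, roughly:
#   flags, vk = parse_hotkey("shift-ctrl-alt-e")
#   ctypes.windll.user32.RegisterHotKey(None, 1, flags, vk)
# and watch for WM_HOTKEY in the message loop to show the settings window.
```

A global hotkey like this works even when the custom button is invisible to the screen reader, which is exactly what we need as a first step.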
Another possibility is a simple separate executable that could be installed, called something like “Configure Equalify” which somehow signals the Equalify settings to come up. People could use that executable if they can’t use the Equalify button on the Spotify screen.
The important point is that there is not much else we can achieve until a blind person can actually get the settings dialogue to come up, one way or another.
Now turning to the settings screen itself, it sounds like you have already started to play with this using NVDA so well done for looking at that. I completely understand your comment about designing the controls to look and feel the way you want, without considering blind users. That, in a nutshell, is the very essence of why certain applications are just unusable by blind people. But you have also said all the context menus for the equaliser bands work already so it is just a matter of creating a way to select each band to play with. I guess only you can figure that out but all we really need is something we can tab or navigate to for each band, or a combobox or something to select which band to adjust.
You also said that the preset and device comboboxes are standard and I expect they will work with NVDA.
Now to answer your questions. You talked about creating your own speech in response to certain events. This kind of application is often referred to as self-voicing. Some applications do work this way but they are usually not desirable and really it is only done for certain rather specialised situations. The problem is usually that you end up with both the application and the screen reader talking at the same time, so you need to disable the screen reader. That can be done though, but then it means you are completely reliant on the application because effectively you have no screen reader in that situation. Another reason why this is not such a good option is that blind people don’t just use speech. A screen reader like NVDA can also drive a braille display, and for some blind people that is their preferred way of working. So the standard recommendation to developers is to try to follow the API rules so the screen reader works, and then let the screen reader take care of the interactions with a blind user. Also when you refer to creating your own speech in response to mouse events like hovering etc, don’t forget that blind people really don’t use the mouse at all. Some very advanced users know how to use the screen reader to move the mouse in a virtual sense, but again that would only be in certain very specific situations. So think keyboard. But I don’t want to completely discourage you from considering self-voicing.
However when you say “Making the application work with the other screen readers would be very hard from what I’ve seen”, I’m not clear what you mean. The main problem is still the Equalify button to bring up the settings screen. We can’t really get started without that. My suggestion is to think about my comments above and see if you can come up with a way to reliably start the settings screen. Then let me try that so we can see just what access issues remain. Yes, there may be a need for a new control to allow a blind person to choose the equaliser bands to work with, but once that is done then perhaps your existing controls can be used for adjusting the equaliser. The other controls on the settings screen might already be ok.
I am off topic, but I just want to say I am moved by you overcoming an obstacle like being blind. I can’t even imagine developing anything without my eyes. Not only that, but you are so knowledgeable. I just wanted to say that, and thanks for the motivation in life and all things.
I’ve not forgotten about this!
There have been some Spotify updates that I had to create fixes for, and that took some time.
I have done some testing though, and there does not seem to be any way of adding screen reader support to the current GUI for Equalify Pro.
I’ve tested self-voicing, but that requires the use of a mouse to work, as the controls I have do not listen to keyboard input at all. And adding keyboard support to them would be too much work for too little reward.
I’ve also tried the accessibility APIs from Windows, but those won’t really work with the custom controls either.
And I’ve tested with the NVDA APIs…
None of those will be able to do what’s needed without a massive rewrite of pretty much the whole application, which would take weeks…
I have an idea though.
What about a separate version of the EQ, included with the normal one?
This would be a separate window that can be shown and hidden using a hotkey.
This new window would have normal Windows controls for everything, which would work with screen readers.
For a start, it could have a basic 5- or 10-band EQ and the device changer options.
It would be a simplified version of the main application, but it should do all the basics.
Does that sound like an acceptable solution?
Sorry I’ve been busy on something else so have only found time today to think about this and reply.
Yes if it is a separate window then that’s better than nothing at all. The most important thing is the hotkey to show and hide it, or even just to show it because it could have a conventional close button.
Personally I’m not interested in the graphic eq though I’m sure some users would like that. It is the output device changer I really need.