"Baby Yoda" is probably the first thing I think of when I look back at EPIQ 2020. It appeared in a message in Quorum Studio, the integrated development environment for Quorum.
Quorum is an evidence-based programming language: user feedback is gathered and incorporated before any feature is added. This is not a language crafted by someone alone in a lab. Yes, there is a lab involved, but every feature is backed by a user request, a user experience study, or both. This is perhaps why the language is so easy to use, and it is what drew me to it four years ago. At the time, I was building a chatbot and needed something accessible for creating a user interface. I wanted to use synthetic speech, and Quorum's implementation was one of the easiest to code against.
EPIQ, or Experience Programming in Quorum, is the annual conference for the language. In previous years it was an in-person event, and I used to routinely go green with envy as members of the Quorum community described the fun they had. This time, the conference was conducted online.
It started with a fascinating keynote by Dr. Jenna Gorlewicz about haptics and her work on touch interaction.
I did not realize how much you could do by varying the frequency of the single vibration motor in a mobile phone. Quorum does have touch-handling capabilities. I was also interested in learning about standards for haptic feedback and the extremely interesting work she is doing in using haptics to present graphs to the blind. This technology has huge potential because it is mainstream.
The session was interactive, which meant that I could ask whatever questions I wanted and did not fall asleep.
Dr. Andreas Stefik next took the floor and outlined the development of Quorum, the new scene editor, the licensing of Quorum, and a range of other things. By the end of his address, my fingers were itching to begin coding.
Every feature in Quorum is designed for accessibility from the ground up. The scene editor is one of the best examples of this. In other game engines, it is not easy to lay out characters if you are a screen reader user. Here, you have a tree view from which you select a character and drop it onto the grid. The coordinates of the grid are spoken as you navigate. If you are a programmer, this is similar to the Visual Basic toolbox, but even better, because there you had to check the properties of each control and do the math in your head.
Here, it is possible to have overlapping objects on your map, and there is a mechanism for changing the properties of each object.
We then broke into the advanced and learner sessions. The content was the same; only the pace differed.
In the first day’s session we delved further into the scene editor and made a map of our game’s world. We held a fascinating discussion on the different types of games, such as role-playing games and first-person shooters.
The day ended all too soon.
Day 2 was just as interesting: we dove into coding.
We had our map in place but now had to get our objects. We knew the objective of our individual games, so we had to decide how to achieve it. We were working on a role-playing game and had to set up our non-player character, who would tell us about our quest. Along the way, we continued learning about the physics system and movement. This is where I began to appreciate what I had learnt in high-school physics class: it was the practical application of the concept of velocity. Yes, in physics class the teacher gave examples and probably defined the term, but the application only became clear on day 2.
I had to use velocity to move the character in the game engine, taking physics into account.
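The idea above can be sketched in a few lines. This is not Quorum code, just a minimal Python illustration of how a game engine turns a velocity into per-frame movement; the `move` function and units are hypothetical.

```python
# Hypothetical sketch (not Quorum): per-frame position update using velocity.
# A character's position changes by velocity * elapsed time each frame.

def move(position, velocity, seconds):
    """Return the new (x, y) position after `seconds` of travel.

    position and velocity are (x, y) tuples; velocity is in units/second.
    """
    x, y = position
    vx, vy = velocity
    return (x + vx * seconds, y + vy * seconds)

# A character moving right at 50 units/second, over one 1/60-second frame:
print(move((0.0, 0.0), (50.0, 0.0), 1 / 60))
```

The engine repeats this update every frame, which is exactly the physics-class definition of velocity applied in practice.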
Day 3 was when we did some solid work adding music and sound and completing our games’ objectives. I think this is also when we were introduced to collision handling, which is used everywhere in game creation. My objective was for my character to eat 21 mushrooms. Besides the coding, the facilitators encouraged us to explore the different techniques by which something could be done. There was no one right way, and many of us tried different approaches based on our various games.
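To make the mushroom objective concrete, here is a hedged sketch, again in Python rather than Quorum: a collision handler that counts mushrooms eaten and flags the win at 21. The `on_collision` function, the state dictionary, and the string labels are all illustrative assumptions, not Quorum's actual event API.

```python
# Hypothetical sketch (not Quorum's API): counting collisions toward a goal.
# Each time the player collides with a mushroom, eat it; the game is won at 21.

GOAL = 21

def on_collision(state, other):
    """Collision handler: update game state when the player hits something."""
    if other == "mushroom":
        state["eaten"] += 1
        state["won"] = state["eaten"] >= GOAL
    return state

state = {"eaten": 0, "won": False}
for _ in range(21):
    on_collision(state, "mushroom")
print(state)  # {'eaten': 21, 'won': True}
```

Real engines deliver such events through registered listeners, but the core pattern of reacting to a collision and updating game state is the same.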
Day 4 was the heaviest day in terms of brain cycles expended. We worked on functions such as finding the object closest to our player character. An overarching theme of our game design was accessibility. Just putting up our game universe, scenes, and graphical players was insufficient; the idea was to make the game playable by as many people as possible, with varying abilities. Most of us incorporated screen reader support.
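The closest-object exercise boils down to a distance comparison. A minimal Python sketch, assuming objects are simple (x, y) positions (Quorum's engine works with richer item objects, so this is only the shape of the idea):

```python
# Hypothetical sketch (not Quorum's API): find the object nearest the player.
# Squared distance preserves ordering, so no square root is needed to compare.

def closest(player, objects):
    """Return the (x, y) position in `objects` nearest to `player`."""
    return min(
        objects,
        key=lambda o: (o[0] - player[0]) ** 2 + (o[1] - player[1]) ** 2,
    )

print(closest((0, 0), [(3, 4), (1, 1), (-2, 5)]))  # (1, 1)
```

A feature like this is also an accessibility aid: announcing the nearest object gives a screen reader user the same spatial hint a sighted player gets at a glance.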
We also worked with the audio system and were able to add sounds to various game events, such as collecting weapons.
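One common way to wire sounds to events, sketched here in Python as an assumption rather than Quorum's actual audio API, is a simple lookup from event name to sound file; the event names and file names below are invented for illustration.

```python
# Hypothetical sketch (not Quorum's audio API): map game events to sound files
# so one lookup plays the right clip whenever an event fires.

SOUNDS = {
    "weapon_collected": "pickup.ogg",
    "level_complete": "fanfare.ogg",
}

def sound_for(event):
    """Return the sound file for a game event, or None if the event is silent."""
    return SOUNDS.get(event)

print(sound_for("weapon_collected"))  # pickup.ogg
```

Keeping the mapping in one table makes it easy to audit which events are audible, which matters when sound is a primary channel for blind players.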
Day 5 was when I really felt the ease of use of Quorum. I had completed my game and wanted to add magnification to it. My wife uses magnification, so this was a significant feature for me. In Quorum, it took about five or six lines of code. In short, I moved the camera with my player and used its SetZoom action to allow custom zoom values. Be warned: zoom values start from 1, which is the default. If you start at zero, you see nothing on the screen.
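The zero-zoom pitfall suggests guarding the value before handing it to the camera. A tiny Python sketch of that guard (the `safe_zoom` function and its minimum are my own assumptions, not part of Quorum):

```python
# Hypothetical sketch (not Quorum's camera API): guard the zoom factor.
# Zoom 1 is the default; 0 would scale everything away, so clamp to a minimum.

def safe_zoom(requested, minimum=0.25):
    """Return a zoom factor that never drops to zero."""
    return max(requested, minimum)

print(safe_zoom(2.0))  # 2.0 - magnified view
print(safe_zoom(0.0))  # 0.25 - clamped instead of a blank screen
```

With a clamp like this, a user can freely decrease magnification without ever blanking the screen.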
On all 5 days, the sense of community was palpable and my compliments to the organizers for creating such a safe and vibrant group.
Day 5 was also when we were shown what Quorum could do in a non-technical setting. The language is being used in schools across the USA, and many of the teachers described how they had used LEGO to get children with multiple disabilities to emerge from their shells and begin interacting with their environment. Moreover, the language and the work also led to a much better appreciation of blindness among sighted peers and FIRST® LEGO® League coaches. One really fascinating project was a LEGO robot that could throw an item across a table. Another was a robot that swept a white cane and found its way around the obstacles it detected.
The conference concluded with a stunning platform game demonstration by William Allee, one of the lead creators of Quorum. His platformer featured jumping, exceptionally creative use of the properties of the game engine’s objects, terrific sound effects, and parallax motion achieved with multiple scene files.
So, what else did I learn?
- Pre-conference conversations are as much fun online as offline.
- My parents were right to expose me to a wide variety of things and to give me the space to explore.
- There are insects colloquially called murder hornets. I believe their sting is one of the most painful.
- It is possible to leave the physics engine disabled but still set objects to be collidable so that they respond to collision events.
- Debugging errors is heaps of fun if you do it with the right set of people.
- Accessibility works best when experienced, so here is a lesson for all accessibility advocates: step away from presentations and use games.
- I can sit without moving for over 4 hours given the right level of interest.
- There are people in the world who ask good questions and from whom you learn.
- Math is much more fun when applied and this can start from early school.
- My respect for researchers continues to deepen, especially when I watch them collate multiple inputs and deal with conflicting priorities.
One of the questions we were asked frequently was “How can EPIQ be improved?”
It is a difficult one to answer because the online format was being tried for the first time. I hope EPIQ 2021 adopts a hybrid model in which those of us who cannot fly to the USA can also join. Hmm, how about a “destination EPIQ”?