Assessing User Preferences in Programming Language Design
The design of new programming languages has primarily been guided by the preferences of a few (the authors of the language), rather than by systematic study of the available options. This is in part because user studies that effectively test usability or understandability hypotheses are cumbersome and expensive. An interesting question is whether crowdsourcing techniques can be leveraged to improve this situation.
We explore this idea using a specific example. While the streaming data paradigm is a popular way to express parallelism within applications, there has been little consensus on the methods used to express streaming topologies. Here, we use Mechanical Turk to recruit a community of self-described programmers to assess user preferences and code readability for two techniques currently in use for expressing streaming application topologies.
The positive results of this study suggest that crowdsourcing can be an effective and inexpensive way to assist language developers in making good design choices.
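The abstract does not name the two topology-description techniques that were compared. Purely as an illustration of the kind of notational difference such a study might ask participants to judge, the sketch below contrasts two hypothetical styles for describing the same three-stage stream: explicit graph construction versus fluent pipeline chaining. Neither style is claimed to be the notation actually evaluated in the paper; all class and method names here are invented for the example.

// Illustrative sketch only: two hypothetical styles for describing the same
// three-stage streaming topology (reader -> filter -> writer). These are NOT
// the notations compared in the paper; they merely show the kind of
// surface-syntax difference a readability study might present to participants.

import java.util.ArrayList;
import java.util.List;

public class TopologyStyles {

    // Style A: explicit graph construction; stages and edges are declared separately.
    static final class Graph {
        private final List<String> stages = new ArrayList<>();
        private final List<String> edges = new ArrayList<>();

        void addStage(String name)           { stages.add(name); }
        void connect(String from, String to) { edges.add(from + " -> " + to); }

        @Override public String toString() {
            return "stages=" + stages + ", edges=" + edges;
        }
    }

    // Style B: fluent pipeline chaining; the topology is implied by call order.
    static final class Pipeline {
        private final List<String> stages = new ArrayList<>();

        static Pipeline from(String source) {
            Pipeline p = new Pipeline();
            p.stages.add(source);
            return p;
        }
        Pipeline then(String stage) { stages.add(stage); return this; }
        Pipeline to(String sink)    { stages.add(sink);  return this; }

        @Override public String toString() { return String.join(" -> ", stages); }
    }

    public static void main(String[] args) {
        // Style A: the same topology built edge by edge.
        Graph g = new Graph();
        g.addStage("reader");
        g.addStage("filter");
        g.addStage("writer");
        g.connect("reader", "filter");
        g.connect("filter", "writer");
        System.out.println("Graph style:    " + g);

        // Style B: the same topology expressed as a single chained expression.
        Pipeline p = Pipeline.from("reader").then("filter").to("writer");
        System.out.println("Pipeline style: " + p);
    }
}

A crowdsourced study in this vein might show participants equivalent topologies written in each style and ask which they find easier to read or modify; the point of the sketch is only that the same underlying graph can be written in structurally different ways.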
Wed 25 Oct (times shown in the Tijuana, Baja California time zone)

10:30 - 12:00  Language Design (Onward! Papers) at Regency B
Chair(s): Zachary Tatlock, University of Washington, Seattle

10:30  Talk (30m): Can We Crowdsource Language Design?
       Preston Tunnell Wilson (Brown University), Justin Pombrio (Brown University, USA), Shriram Krishnamurthi (Brown University, USA)
11:00  Talk (30m): Assessing User Preferences in Programming Language Design
       Roger Chamberlain (Washington University in St. Louis)
11:30  Talk (30m): Replacing Phrase Structure Grammar with Dependency Grammar in the Design and Implementation of Programming Languages
       Friedrich Steimann (Fernuniversität)