Last week we posted Open Referral’s 2015 Year in Review report. [You can view and download the document here.] We’ll unpack the key parts of the report here on the blog. This post will cover our technological accomplishments and feedback; the next post will cover our projects in the field; and a final post will consider the path ahead.
A world in which information about community resources is easy for anyone to find, trust, and effectively use — in whatever way works best for them. This is Open Referral’s hopeful vision of the future.
In 2015, we saw the first glimmers of such a world. Let’s take a look:
We published the Human Services Data Specification
Drafted by Sophia Parafina, through nine iterations of public comment and testing, the Human Services Data Specification (HSDS) is an open format that enables the exchange of machine-readable directory data about health, human, and social services among different kinds of information systems — including conventional calling center systems, emerging user-centered applications, and the major web platforms such as Google, Facebook, Yelp, etc.
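The structure HSDS describes can be pictured as a set of linked records: organizations provide services at locations, tied together by IDs. Here is a minimal sketch in Python; the field and table names are simplified for illustration, so consult the specification itself for the authoritative schema:

```python
# A minimal sketch of HSDS-style directory data. Organizations, services,
# and locations are separate records linked by foreign keys (IDs).
# Field names here are illustrative, not the authoritative HSDS schema.

organizations = [
    {"id": "org-1", "name": "Example Community Center"},
]

locations = [
    {"id": "loc-1", "organization_id": "org-1",
     "name": "Downtown Branch", "address": "123 Main St"},
]

services = [
    {"id": "svc-1", "organization_id": "org-1",
     "name": "Job Counseling", "status": "active"},
]

# A linking table records which services are offered at which locations.
services_at_location = [
    {"id": "sal-1", "service_id": "svc-1", "location_id": "loc-1"},
]

def services_at(location_id):
    """Resolve which services are offered at a given location."""
    svc_ids = {row["service_id"] for row in services_at_location
               if row["location_id"] == location_id}
    return [svc["name"] for svc in services if svc["id"] in svc_ids]
```

Because every record carries its own ID, two different systems can exchange these tables and still reconnect services to the organizations and locations they belong to.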
HSDS v1.0 was published in March 2015, and is already being adopted by a range of referral software providers, from Purple Binder to iCarol. HSDS is even being adapted for the specific needs of particular subdomains, from legal services (through the leadership of LegalServer) to emergency response services (through the leadership of Sarapis).
The Ohana Project delivered an open source directory platform
Ohana’s products include the Ohana API, which enables any resource directory to be transformed into an open platform, and the Ohana Web Search, a free open source mobile-friendly front-end application that enables simple searching for services.
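To give a rough sense of how a front-end like Ohana Web Search consumes such an API, here is a hedged Python sketch that parses a search response. The JSON shape shown is an assumption for illustration, not an authoritative description of the Ohana API; check a deployment's own documentation for the real endpoints and payloads.

```python
import json

# A hypothetical payload like one an Ohana API search endpoint might
# return (e.g. for a keyword search on "food"). The exact JSON shape
# is an assumption for illustration; consult your deployment's docs.
sample_response = json.loads("""
[
  {"name": "Food Pantry",
   "organization": {"name": "Example Org"},
   "address": {"street_1": "123 Main St", "city": "San Mateo"}}
]
""")

def summarize(results):
    """Turn raw search results into short display strings for a front end."""
    return ["{} ({}) - {}".format(
                loc["name"],
                loc["organization"]["name"],
                loc["address"]["city"])
            for loc in results]
```

The point of the "open platform" framing is exactly this: once the directory is exposed as structured JSON over HTTP, any number of front ends can build their own views of the same data.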
Ohana has been redeployed by a number of initiatives, from the Boston Children’s Hospital to NYC:Prepared, to the SociSalud resource locator (launched by Medialab-Prado with data from Madrid’s open data portal).
We fostered an emerging ecosystem of open source tools
Taken together, Ohana and HSDS offer up the elemental building blocks of an open ecosystem of interoperable information systems. Over the course of the year, we saw that ecosystem start to take shape.
For example, Link-SF, a mobile resource locator initially developed by Zendesk for St Anthony’s in San Francisco, is now being adapted to ‘speak’ the HSDS format and read from APIs such as Ohana. It has already been redeployed in other localities such as Queens.
In D.C., Social Impact Lab — in partnership with the DC Public Library — has prototyped a ‘Logic Library’ that enables users to build screening toolkits which can help people determine which services might be relevant and available to them.
Evaluating our progress
Open Referral is engaged in an iterative process of experimentation and discovery — like a laboratory. Over the course of this past year, we’ve gathered a considerable amount of feedback about what’s working and what can be improved.
HSDS: technically apt, with room to grow
The good: We’ve received positive feedback about HSDS from technical partners at enterprise software vendors (especially regarding HSDS’s approach to normalizing the relationship between services, sites, and organizational entities). Several vendors are using it as a model for structuring new products, in addition to facilitating data exchange.
Needs improvement: Feedback indicates that HSDS falls short of one of our initial principles: simplicity. The specification is daunting for relatively non-technical users (who, for example, shouldn’t be expected to be familiar with JSON datapackages, foreign keys, etc.). In particular, managing HSDS-compliant resource data in a spreadsheet would be prohibitively difficult unless additional tools are developed to ease the process. Additional issues regarding HSDS have been logged in the HSDS Github repo.
What we can change: In future iterations, we might consider a dual approach that supports both flat and structured data formats (a precedent already set by the Open Contracting Data Standard). I’ve also proposed adopting a ‘Share Alike’ license that would encourage adaptation while preserving the open source nature of new contributions. [Read more about this licensing proposal here.] Note that future development of HSDS is pending the re-establishment of technical leadership. In the meantime, we encourage users to adapt the specification as needed, while documenting their work and reporting back to the community.
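One way such a dual approach could work, sketched loosely after the Open Contracting precedent: keep the normalized tables as the canonical form, and mechanically generate a flattened, spreadsheet-friendly view from them. A hedged Python sketch, with illustrative (not official) field names:

```python
# Flatten normalized HSDS-style tables into one denormalized row per
# service, suitable for editing in a spreadsheet. Field names are
# illustrative; this is a sketch of the idea, not a proposed standard.

organizations = {"org-1": {"name": "Example Community Center"}}
services = [
    {"id": "svc-1", "organization_id": "org-1",
     "name": "Job Counseling", "email": "jobs@example.org"},
]

def flatten(services, organizations):
    """Join each service with its organization into a single flat row."""
    rows = []
    for svc in services:
        org = organizations[svc["organization_id"]]
        rows.append({
            "service_name": svc["name"],
            "service_email": svc["email"],
            "organization_name": org["name"],
        })
    return rows
```

A non-technical maintainer could then edit the flat rows in a spreadsheet, while tooling translates changes back into the normalized, machine-readable form.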
Ohana: transform any resource directory into an open platform
The good: We’ve been told that Ohana (the API and associated tools) is easy enough to redeploy, and developers have, for the most part, approved of the code and documentation. Several sites have been deployed that present pre-existing data in new and more effective ways. In situations where resource directory data is already being diligently maintained, Ohana can be useful in exposing that data to more tools and people.
Needs improvement: We have received ample feedback that the current version of Ohana doesn’t address the challenges of directory data maintenance. Ohana’s basic admin interface is challenging for users, especially those who are tasked with understanding and improving the quality of records that may be out-of-date, incomplete, or otherwise inaccurate. Users have articulated the need for feedback loops from external apps that enable users to comment on data accuracy, as well as workflow tools to make record verification as easy as possible. Additional requests for features have been logged here.
What we can change: Ohana could evolve to meet these needs, or something new could emerge to take its place. This is for us to decide. Like almost all open source civic tech projects, Ohana’s future will depend upon the value it brings to users. We invite those who have deployed Ohana and/or have experience with it to join us in this deliberation.
Evaluating our Process
The good: We conducted an iterative, public comment process that was driven by hands-on research in partnership with stakeholders in our pilot projects. This process received generally positive feedback, especially in dealing with key points of disagreement along the way. (Most prominently: approval of HSDS v1.0 hinged upon an architectural decision that ignited a month-long debate about data normalization vs denormalization. This debate underscored the tensions inherent in any effort to construct a useful model out of a complex reality, as well as technical tradeoffs between efficiency of systems and granularity of information, etc.) Our objective was not to ascertain The Truth so much as to ensure that we heard a range of perspectives, prioritized the interests of end users, and sought an outcome which would at least be acceptable to everyone involved.
Needs improvement: Quite a few participants observed that the workflow of conversations was confusingly fragmented between our community listserv, Google Docs, and Github; this sometimes made discussion around particular issues hard to follow. The commenting feature in Google Docs, in particular, proved unwieldy especially for managing complex discussions and for recalling previous conversations.
What we can change: We should be able to streamline the commenting process into a single collaboration tool (such as Github); however, we should also ensure that such a tool is made accessible to non-technical users, through custom interfaces and/or training.
We’re still learning!
Your feedback is essential to this process. Do you have insights to share?
Would you like to participate in the next iteration of the Open Referral workgroup?
Might you or your institution be interested in sponsoring this initiative?
Please be in touch.
Meanwhile, stay tuned for the next post — about various projects that are implementing these technologies.