NearBooks

Purpose

Use iBeacons to help users find content within libraries, and in this way increase library traffic.

Concept

This is an app that helps the user find relevant content depending on the area of the library they are physically located in.

entrance.PNG
sci-fi.gif

Tapping an item shows more details about that specific item, as well as suggestions for items that other people who got that item have also shown interest in.

 
ezgif-2-00f7a0f5a49e.gif
 

It also has a “queue” functionality, where the user can mark a book as interesting so they can find it later when they want to revisit or read it. The other functionality is “history”, where the user can check the books they have previously read.

 
queue.gif
 

How it Works

We used proximity beacons from Estimote to determine the content shown in the app. Each beacon was assigned an area of the library, so when the smartphone comes into range of a beacon, the app displays the content relevant to that specific area.

beacon.png
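The app's actual beacon-handling code is not shown here, but the area detection described above could be sketched in Swift with Core Location's beacon ranging API. The area names, the minor-value mapping, and the UUID below are illustrative assumptions, not the project's real values:

```swift
import CoreLocation

// Hypothetical mapping from a beacon's "minor" value to a library area.
enum LibraryArea: UInt16 {
    case entrance = 1, sciFi = 2, historySection = 3
}

final class AreaDetector: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Illustrative UUID; real deployments use the UUID configured on the beacons.
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!)

    // Called whenever the nearest detected beacon maps to a known area.
    var onAreaChange: ((LibraryArea) -> Void)?

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // Pick the nearest beacon with a known proximity and surface its area.
        guard let nearest = beacons.first(where: { $0.proximity != .unknown }),
              let area = LibraryArea(rawValue: nearest.minor.uint16Value)
        else { return }
        onAreaChange?(area)
    }
}
```

Ranging (as opposed to region monitoring) reports proximity continuously while the app is in the foreground, which fits this use case: the content should update as the user walks between areas.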

Creation

I was involved in the ideation, definition, development, and evaluation stages of the iOS app. For both the iOS and Android apps, we sketched the wireframes and determined the functionalities and look & feel.

 
sketch.gif
 

However, this first iteration, specifically the functionality it centered on (the ability to find nearby people who were reading the same book), was not as well received as we had initially expected.

 
Captura de pantalla 2019-02-15 a la(s) 13.09.54.png
 

So we decided to focus on showing relevant content depending on the area the user was in, while keeping the history and queue functionalities.

To develop the look & feel we used Sketch combined with Illustrator CC. The programming was done in Swift in Xcode by another team member. Afterwards, we tested the app internally on an iPhone 7 Plus and an iPhone SE, and used the same hardware for the user tests.

It is worth mentioning that the team also developed an Android version of the app; however, I did not participate in its development or evaluation.

Evaluation

I helped conduct the user tests, which were done in two parts. The first was a usability test paired with a think-aloud protocol: users tried the app and its functionalities, which also gave us a more precise view of how the beacons behaved in a less controlled environment. The second part was an informal interview in which we asked users questions to gather further insight into their thought process and general perception of the app.

 
User Testing.jpg