Technology Readiness Site Survey

Issue Found

We know that simply providing a SolarCubed ICT Lab to less privileged schools does not ensure that the technology will be used as we hope, or in a way that addresses educational needs. The educators at a given school must be interested in making fruitful use of technology before receiving the equipment. No matter how innovative and practical the technology is, the educators are the ones who must learn it and then apply it in their teaching: they are the ones capable of ensuring that its potential impact is maximized.

Mission, Challenge, and Solution

To maximize the chance that the technology will be adopted and prove useful in an educational context, our goal was to develop a logical, reproducible method of evaluating a school’s “readiness” for technology. The system must be able to interpret macro-level data and identify the best candidates for the SolarCubed ICT Lab.

The challenge is to assess which schools are “ready” for the technology in an efficient and economical manner. We determined that a well-designed survey, which could be carried out by locals, would capture both the school’s infrastructural capacity to use and sustain such technology and the educators’ attitudes toward, and capacity for, using it in the school.

Creating the Survey

Production of the survey was divided into three phases:

1. Forming suitable questions

This phase consisted of creating a pool of questions that our team felt would best capture a picture of the school. The survey was designed to be broad so that other teams could use it in different regions of the world. At the same time, we filtered out as many questions as possible, without compromising the integrity of the survey, in order to reduce the time required to complete it. After trial and error we found that the most effective format was a two-part survey: one part collects information about the structure and demographics of the school, while the second part aims to determine the educators’ attitudes toward and experience with technology. Ideally, Part I is completed by a school administrator or principal, while Part II is directed at the teachers.

[Survey explanation graphic]

2. Developing the software

The team explored several options and settled on ODK Collect as the best software for this survey. ODK gave us an open-source, open-access product capable of creating and distributing surveys, and it let us view the collected data in an organized fashion. Through the ODK app, available on any Android device, an administrator can download and save the survey to the device, complete it offline, and upload the results once a data connection becomes available.
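Surveys for ODK Collect are typically authored as XLSForms: a spreadsheet whose `survey` sheet lists one row per question. The sketch below shows what a fragment of Part I might look like; the question names and labels are invented for illustration and are not our actual survey questions.

```text
survey sheet:
type     | name          | label
---------|---------------|----------------------------------------
text     | school_name   | What is the name of the school?
integer  | num_teachers  | How many teachers work at this school?
image    | classroom     | Please take a picture of a classroom.
```

The `image` question type is how a form prompts the administrator to attach photographs, such as the ones discussed later in this report.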


3. Testing the survey

Before launching, our team administered the survey to multiple test groups, both on our campus and around the world, through students’ and colleagues’ international connections. The test groups gave us insightful feedback on which questions they felt were unnecessary or confusing, as well as on the survey’s interface, usability, and the time it required. All suggestions were taken into consideration and several changes were made.



Training Survey Administrators

Training survey administrators was a vital component of our project. The team put together an ODK Collect Help Page, shown below (the link to the full document appears at the end of this page), to give administrators a quick overview of the application. We also held a Skype call with the administrators and walked question by question through the survey, both to build their familiarity with it and to answer any questions they had about its design.

[Screenshot of the ODK Collect Help Page]

Working with Chuuk

Although we devoted time and effort to training our survey administrators in Chuuk, we did face some challenges. Our partners were often unreachable for days at a time, and we could not move forward with data analysis on our end because no new data was arriving from the administrators. Training over a long-distance connection with no real possibility of face-to-face communication (due to bandwidth limitations) proved frustrating as well, since we could not be certain that our training had been successful. Although these circumstances frustrated our team, they are a good example of the real challenges teams face when working with partners halfway across the world, and we share them here to help future projects prepare for the greatest area of struggle: the human side of technology projects!

Ultimately, the survey administrators were able to fully survey five schools, but these schools were all located on the main island: all had electricity and some already had computer labs. These were not the rural, unconnected lagoon schools we had hoped to survey. This is a challenge that should be addressed more directly during training in the future.


Gathering Data

Our team was able to download the data from the ODK server as our survey administrators uploaded it. The raw data was easy to read, convert to an Excel spreadsheet, and ultimately upload into a Google Document for shared online access. At this point, however, our team encountered two issues with the ODK application.

  1. Server capacity –
    ODK allows free use of its servers but restricts the Internet traffic and download volume of applications running on free accounts with quotas: only a limited number of files of a given size can be downloaded per day. Because the quota refreshes every 24 hours, this was only a minor annoyance for us, as we downloaded the files in .xls format.
    However, the quota cap could become a real problem if it is reached after downloading only three or fewer files, or in the middle of downloading a single file. A future team running ODK Aggregate on Google App Engine servers may decide that the free ODK server is too limiting for large-scale batch analysis of schools and switch to an alternate data application/server such as FormHub or Fulcrum. (Our team decided that in a future iteration of the survey, we would certainly pay for greater server capacity and capabilities.)
  2. Downloading pictures from the survey –
    Because of the length of the surveys and the number of pictures they contained, there was no way to automate downloading the pictures from our server instance of ODK without losing track of which submission each image belonged to. Instead, each picture was selected from the survey and saved locally to a corresponding folder, with a file name identifying what it pertained to. In all, 89 pictures were received across both surveys, with individual submissions containing as few as 1 and as many as 25 pictures. This was an unfortunate restriction, since the pictures added such richness to our survey: as they say, a picture is worth 1,000 words!
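Had the exported pictures followed a predictable naming scheme, this bookkeeping could have been scripted rather than done by hand. Below is a sketch of that idea, assuming hypothetical filenames of the form `<submission>_<question>.jpg` (not the names ODK actually produced for us):

```python
import shutil
from pathlib import Path

def organize_media(src_dir: str, dest_dir: str) -> int:
    """Move pictures named '<submission>_<question>.jpg' into one
    folder per submission. Returns the number of files moved."""
    src, dest = Path(src_dir), Path(dest_dir)
    moved = 0
    for picture in src.glob("*.jpg"):
        # Split e.g. "school03_classroom.jpg" into submission id
        # ("school03") and question tag ("classroom").
        submission_id, _, _question = picture.stem.partition("_")
        folder = dest / submission_id
        folder.mkdir(parents=True, exist_ok=True)
        shutil.move(str(picture), folder / picture.name)
        moved += 1
    return moved
```

With 89 pictures across all submissions, even this small amount of automation would have saved an hour or two of manual sorting, but it depends entirely on the server exposing stable file names.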

Analyzing Data

After the data had been downloaded from the ODK servers, the team faced the question of how to comparatively analyze the data sets from the individual schools surveyed. By developing a series of formulas within Excel, we created a numerical value for each question in the survey. Starting with a simple point system, we assigned a multiplier of ±10% to the answers to key questions. This system is intended to evolve along with the survey, allowing the more pertinent questions to carry a higher weight in the resulting score.

By placing this set of equations at the bottom of the Excel document, any user with access to the master formula can easily copy it into the latest data downloaded from the ODK server, allowing easy reuse. The end goal of this process is not to reduce the surveyed schools to a pass/fail criterion, but to develop a “short list” of schools that would benefit most from the resources at hand.
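The spreadsheet logic can be sketched in code. In the Python sketch below, the question names, point values, and which questions carry the ±10% multiplier are all illustrative placeholders, not our actual formula:

```python
def readiness_score(answers: dict, points: dict, key_multipliers: dict) -> float:
    """Sum base points for each question answered 'yes', then apply a
    +/-10% multiplier for each key question answered 'yes'."""
    score = sum(points.get(q, 0) for q, a in answers.items() if a == "yes")
    for q, mult in key_multipliers.items():
        if answers.get(q) == "yes":
            score *= 1 + mult  # e.g. mult = 0.10 boosts the score by 10%
    return round(score, 2)

# Illustrative inputs (not the real survey questions)
points = {"has_secure_room": 5, "teachers_trained": 8, "community_support": 4}
key_multipliers = {"has_electricity": 0.10, "flood_prone": -0.10}
answers = {"has_secure_room": "yes", "teachers_trained": "yes",
           "has_electricity": "yes", "flood_prone": "no"}
```

Ranking the resulting scores and taking the top few schools then yields the “short list” directly, without imposing a pass/fail cutoff.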

Below are a few images of the beta-test Excel formulas:

Data Analysis -Working(Tony).xlsx

Find out More

Find our Survey on the following two web pages:

SiteAssessment V1.5 Part 1:

SiteAssessment V1.5 Part 2:

Download our ODK Collect How-To/Help Page

Download our team’s Final Report

To find out more on using our survey for your project please contact [email protected]