
Google Voice Search Project Story Page

My team is a research support group

The Google Patent Search support team (the Team) is a group of customer support experts who assist Google patent researchers.  We help customers troubleshoot and format Google Patents searches.  The Google Voice Search project tested searching by voice against searching by keyboard, using participants from the Team.  If the tests showed that searching by voice was faster, spoken searching would be recommended to the Team as a process improvement.   The following steps were taken to test searching by voice on Google.

I identified the research problem

Step 1:  Requests for my Team members' assistance were steadily increasing, but there was no funding to increase the size of the Team. Therefore, it was necessary to find a way to speed up the Team's ability to perform searches.  If Google Voice Search was a faster way for the Team to perform searches, it could help the Team solve its capacity problem.  I was hopeful that Google Voice Search would be the answer.

The whiteboard concept-diagram sketch below shows my thoughts on defining the problem. To summarize, it shows that the Patent Search Team has limited time and too few members to perform searches that are lengthy and contain complex terms.

I listed the research questions and specified research methods

Step 2: The questions to be answered were:

·         Do Team members take longer to key a search than to speak it?

·         Do Team members create longer, more complex searches when the search is spoken rather than keyed?  Do voice searches vary from keyboard searches?

·         Does Google Voice Search retrieve the same number and type of patents that are retrieved in a keyboard search?

·         Do the results obtained from Google Voice Search vary from one Team member to another?

·         Are some search elements easier or more difficult using voice searching?

Considering the research questions to be answered, the best research methods for the Google Voice Search Project were field study, nondirected interview, and closed-ended questionnaire.  

I checked Team calendars and drew up a schedule

Step 3: I estimated that the research with Team members could be carried out in approximately two weeks.

·         First week: I planned to conduct field visits with Team members. 

o   Each Team member would be given four research scenarios for which to formulate a search.  See the four research scenarios.  These scenarios are similar to actual customer scenarios.  For two of the scenarios, the search would be entered using Google Voice Search; for the other two, the search would be entered by keyboard.

o   The Team member would sort each set of search results by relevance and identify the top two most relevant patents. 

o   The Team member would copy and email the search history and two most relevant patents to me. 

·         Second week:  I planned to administer the closed-ended questionnaire, analyze the test results, and create a report.  I knew this was an aggressive schedule, but we could do it if we stayed focused.

The sketch below shows the Field Study Calendar that I first thought would work.  My plan changed as various Team members had to adjust their schedules; as noted below, three sessions were moved.  I started with a schedule that had only one field study interview per day.  That began to change when Team members C and D both had to move their sessions from days 3 and 4 to day 1.  The change also affected my thinking in a deeper way: I began to fully realize that I was working with real, unpredictable human beings.  I would have to be psychologically ready for any comments they might make during the field studies and stay completely calm and objective no matter what they said or did.  That realization turned out to be very advantageous.  During the field study, one of the participants became very angry because Google kept misinterpreting his voice search.  He shoved the table and stormed out of the room.

I recruited test participants from the target audience

Step 4: The target audience consisted of researchers with knowledge of patent documents as they are used in legal research. Since Team members were pre-filtered to have this knowledge, only a brief recruitment screener was needed.

I created an outreach plan for contacting candidates

Step 5: The Team consisted of 15 possible candidates.  Due to the time constraints of the project, the goal was to obtain participation from approximately half of the group (seven or eight people). I created an outreach plan to encourage participation.  The outreach steps took place during the five business days prior to scheduling the research sessions. During days one through three, I contacted the candidates through in-person office visits or phone calls.  During days four and five, I scheduled the sessions.   The meetings and scheduling went very well; it helps to be familiar with the people who are test candidates.

I planned the data collection work

Step 6: The sessions were planned as in-person field studies with nondirected interviews.   A follow-up closed-ended questionnaire would be sent to the participants by email.   The in-person field studies were one-hour meetings held in an office conference room.   Google search history screens were examined to obtain participants' exact searches.

The nondirected interviews consisted of five questions that captured the participants' subjective opinions about the test.   Three days after each session, the participants were sent the closed-ended questionnaire by email; waiting three days gave them time to reflect on the test experience.  The questionnaire was needed to uncover additional information not obtained during the sessions.

I conducted the test sessions

Step 7: At the start of each session, participants were:

·         Given an overview of the written instructions that accompanied each scenario.

·         Told to “think out loud” as they worked on the test.

·         Given writing utensils to mark up or highlight the search scenarios (i.e., they could circle, highlight, or mark out words in the search scenario to help them focus on search terms they wanted to include or exclude).

·         Told that the PC was ready for searching Google Patents and was already open to the Google Patents search form, so they did not have to spend time locating Google Patents.

·         Told to try as many searches as they wanted until they located at least one relevant patent in the search results.

I collected and analyzed the test data

Step 8: Due to unforeseen circumstances, three changes to the test plan occurred:

·         Three participants were unable to complete the tests.  The actual testing included five participants.

·         Due to time constraints, the number of scenarios was reduced from four to two.

·         The relevancy of search results was to have been determined by the Team's patent attorney, but the attorney was not available for the test.  As a substitute, relevancy was determined by a Team member with over 30 years' experience in searching patents.

Otherwise, the test proceeded as planned.  This was not a setback, but it did impress on me that major components of a plan sometimes have to be changed. 

Field studies:  The population was the full set of searches that would be run by all Team members.  The sample was the set of searches actually run by the test participants.  The results obtained from the test were used to make inferences about all possible searches run by Team members.

Nondirected interviews:  The qualitative data obtained from the nondirected interviews was coded into categories (themes and patterns) so that it could be analyzed statistically.
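
As a minimal illustration of that coding step, the sketch below assigns hypothetical category labels to each participant's comments and tallies them to surface the dominant themes. The participant codes and category labels are made up for illustration; they are not the project's actual coding scheme.

```python
from collections import Counter

# Hypothetical coded interview responses: during coding, each participant's
# comments are assigned one or more category labels. These labels and
# assignments are illustrative, not the project's actual coding.
coded_responses = {
    "Participant A": ["slower than keyboard", "voice misrecognition"],
    "Participant B": ["slower than keyboard", "intimidated by voice search"],
    "Participant C": ["slower than keyboard", "easy to speak search terms"],
    "Participant D": ["slower than keyboard", "voice misrecognition"],
    "Participant E": ["slower than keyboard"],
}

# Tally how many participants mentioned each category.
theme_counts = Counter(code for codes in coded_responses.values() for code in codes)

for theme, count in theme_counts.most_common():
    share = count / len(coded_responses)
    print(f"{theme}: {count} of {len(coded_responses)} participants ({share:.0%})")
```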

The sketch below shows my basic thought process for collecting the data using field studies and nondirected interviews.  This type of sketch provides a quick way to explain how the project obtained its data.

I reported research findings

Step 9: Field Study Observation Findings

During the field study, a scale was devised to rate participants' reactions to the testing.  It covered observations of tone of voice, facial expressions, and body movement.  See field study scale for reactions to test.  Participants were rated on a scale of 1 to 5, and observation comments were included in the findings.   Some participants were very positive, while others were negative.  One participant was so hostile that he refused to take the test.
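
A minimal sketch of how one observation might be recorded against that 1-to-5 scale appears below. The record structure and the example values are assumptions for illustration; the actual scale and comments are in the linked field study materials.

```python
from dataclasses import dataclass

@dataclass
class ReactionObservation:
    """One field-study reaction record; field names and values are illustrative."""
    participant: str
    rating: int              # 1 (very negative) to 5 (very positive)
    tone_of_voice: str
    facial_expression: str
    body_movement: str
    comment: str

# Example record loosely based on the incident described above; details are assumed.
example = ReactionObservation(
    participant="Participant E",
    rating=1,
    tone_of_voice="raised, frustrated",
    facial_expression="scowling",
    body_movement="shoved the table and left the room",
    comment="Angry that Google kept misinterpreting his voice search.",
)
print(example)
```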

Nondirected Interview Findings

The following nondirected interview questions were asked immediately after the tests.  

·         Does using Google Voice Search in Google Patents seem like a faster or slower way to find relevant patents than using the keyboard?  Why?

·         Did you experience any physical difficulty using Google Voice Search?  If so, can you describe it?

·         Did anything about Google Voice Search surprise you?  If so, what was it?

·         Do you think you are likely to try Google Voice Search again when using Google search?  When would you use it?

·         Do you think there is a difference in relevance of search results between Google Voice Search and Google keyboard search?  If so, can you describe it?

The most significant finding was that 100% of the participants thought Google Voice Search was slower.  The theme that emerged was that Google Voice Search is, at best, only somewhat feasible for patent research.

Closed-Ended Questionnaire Findings

The closed-ended questionnaire revealed strong reluctance to use Google Voice Search.   Only 50% of the participants thought it was easy to use and would save time compared to keyboard searching.   See details of closed-ended questionnaire and answer choices.

Research Questions Findings

The data was analyzed to answer the research questions below:

1)      Do Team members take longer to key a search than to speak it?

The data shows that, generally, it takes about the same amount of time to speak a search as to key it.  The chart below shows that most searches, both spoken and keyed, took one minute to enter.   Participant A's searches were an anomaly: he refused to take the Voice Search test, so his spoken-search time was recorded as 0 minutes, and he spent a great deal of time entering his keyed searches with quoted phrases, so those took 3.5 minutes to enter.

2)     Do Team members create longer, more complex searches when the search is spoken rather than keyed?   Do voice searches vary from keyboard searches?  

The number of words per search and the number of characters per word were used as measures of complexity and variation.   The data is skewed by some extreme measurements.  However, it shows that keying rather than speaking results in searches containing more words per search and more characters per word.

As noted previously, Participant A did not take the spoken search test; therefore, his results for this measurement are zero.   Participant B is an anomaly in that he performed only one search consisting of a single nine-character word; this participant was intimidated by Google searching, especially spoken searching.  For spoken searches, the average number of words per search was 2.85 and the average number of characters per word was 5.26.  These averages are skewed by Participant A's and Participant B's performance.

For keyed searches, the average number of words per search was 4.72 and the average number of characters per word was 9.34.   Participant A is clearly an anomaly in his use of longer search terms.
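
The words-per-search and characters-per-word figures above can be reproduced from the logged search strings with simple averages. The sketch below uses made-up example searches; only the two averaging formulas reflect the measures described above.

```python
# Made-up example search strings; the real inputs came from participants'
# Google Patents search history screens.
spoken_searches = ["solar panel mounting bracket", "drone battery"]
keyed_searches = [
    '"photovoltaic module" AND "mounting bracket"',
    '"unmanned aerial vehicle" battery enclosure',
]

def avg_words_per_search(searches):
    """Average number of words per search string."""
    return sum(len(s.split()) for s in searches) / len(searches)

def avg_chars_per_word(searches):
    """Average word length across all searches (quotes and operators count)."""
    words = [w for s in searches for w in s.split()]
    return sum(len(w) for w in words) / len(words)

for label, searches in [("Spoken", spoken_searches), ("Keyed", keyed_searches)]:
    print(label,
          round(avg_words_per_search(searches), 2), "words/search,",
          round(avg_chars_per_word(searches), 2), "chars/word")
```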

3)      Does Google Voice Search retrieve the same number and type of patents that are retrieved in a keyboard search?

The data shows that Google Voice Search is capable of retrieving the same number and type of relevant patents that are retrieved in a keyboard search.  However, the patents the participants identified as most relevant differed significantly, and the difference appears to be related to the scenario searched, not to whether the search was spoken or keyed.  See patents found in scenarios 1 and 2.

I reported recommendations for immediate and future action

Step 10: The results of this study lead to the following recommendations:

·         Since it is not known whether or not the test results follow a normal distribution, additional testing should be performed with different test participants.

·         Due to the largely negative reaction of the participants toward Google Voice Search, it should not be implemented in the Team’s workflow at this time.  Whether or not it retrieves relevant patents, it is not user friendly for the purpose of searching Google Patents.  The theme that emerged from cycle coding of the nondirected interview results was that Google Voice Search is not feasible for patent research. 

·         Additional research on enunciation should be performed with a broader group of Team members.   It may be possible to isolate the optimum conditions under which Google Voice Search will perform as well for any Team member as it did for Participant C in this study.

·         The data shows that Google Voice Search does retrieve the same number and type of patents that are retrieved in a keyboard search.  However, the number and type of patents identified by the participants was found to vary by scenario.   This is an issue for additional research.   If possible, a follow-up group session will be held with the participants to obtain their feedback; a discussion of how they analyzed and selected patents would be enlightening.

Learning Results

This project gave me a lot of experience with detailed planning and interacting with users.   I had to change plans and balance schedules, while continuing to drive toward completing the research goals.  It was a valuable experience. 

See the full report and additional information below.   Please contact me if you have questions.   

Download a PowerPoint presentation on the project.  

A story sketch appears below.   I greatly appreciate my team members!

See full project details including plans, screener questions, graphics, and interview responses. 
