TITLE: Better Service Design for Greater Civic Engagement

1. ABSTRACT

Generally, people have a good understanding of their local areas. Encouraging them to share this tacit knowledge with local authorities, urban designers and city planners could therefore significantly improve the quality of public space design. However, persuading people to share their concerns and ideas about their areas, especially through a digital platform, presents a real challenge. One of the main barriers is a lack of trust in the public feedback system. This research therefore investigates the relationships between online trust and service design in order to provide a guideline on how to design a feedback system that addresses users’ practical and emotional requirements. A mixed-methods approach was employed to identify the key factors affecting online trust and their implications for service design. Six key factors affecting online trust were identified and combined to form the basis for service design guidelines. The outcomes show that service design can support all components required to build trust.

Keywords: Service Design, Online Trust, Public Feedback System

2. INTRODUCTION

‘voiceYourview (VYV) – making public spaces safer’ is a collaborative project of five universities: Brunel University, Coventry University, Lancaster University, the University of Manchester and the University of Sheffield. This multidisciplinary research is funded under the RCUK Digital Economy programme. The study aims to develop a novel real-time digital feedback system that allows people to voice their concerns regarding public spaces. By capturing actionable feedback and suggestions from end users, the quality of public space designs and the level of civic engagement could be improved.

Why should feedback be captured in real time? Imagine that, on a beautiful Sunday, you take a stroll in a nearby park. After five minutes, you notice a pile of rubbish in front of you. You might want to complain to your local authority right away. However, you may be unsure who to contact and how best to formalise and submit your report. In most cases, you are quite likely to forget about it, since it could take time to accomplish the task. By providing a system that allows people to capture their thoughts in real time (at the precise moment when they notice something that concerns them), the chance of people reporting problems and suggesting ideas on how to improve their neighbourhoods may increase.

A user trial conducted at a public library that had recently undergone a major refurbishment showed that public feedback could lead to better design outcomes, since the majority of feedback and suggestions were actionable (Whittle et al., 2010). Most comments were directed at specific design features, e.g. bookshelves and carpets. If these users had been given opportunities to engage with the refurbishment process from the start, the design team might have had clearer ideas on how to improve the space. This result shows that an effective public feedback system has strong potential to support area regeneration projects from an early stage.

Evidently, the public feedback system should be inclusive and easy to use for all groups of people. The system should also be cost-effective, since it is not feasible to expect local authorities with limited public budgets to commit large amounts of investment and staff to make it work. Thus, the VYV system is designed to be fully automated – data will be captured, processed, stored in an online repository and exchanged with suitable stakeholders, e.g. urban designers and town planners. People can post comments, view other comments, remove irrelevant/abusive content and prioritise actions to be taken. Similar to a wiki website, the VYV system needs a high level of user engagement: if people are not interested in using the system, no feedback can be captured. Hence, emotional connections and motivation are crucial to the success of the VYV system. Currently, ViewKI, the prototype online interface of the VYV system (see the description and screenshots on the VYV website: http://www.voiceyourview.com/site/content_viewki.php), has overcome its usability issues. However, the emotional aspects still require further development. Whilst urban designers and town planners may view the VYV concept as a feedback system, users perceive it as a public service and thus expect the degree of satisfaction that they receive from similar services. Hence, service design, which governs all facets of the system and strongly influences user experience, must be properly investigated and carefully developed.
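To make the intended data flow more concrete, the sketch below models a single captured comment and its storage in a repository. This is purely illustrative: the class name, fields and moderation flag are assumptions made for this paper, not part of the actual VYV/ViewKI implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class FeedbackReport:
    """Hypothetical record for one piece of public-space feedback."""
    comment: str                          # free-text concern or suggestion
    latitude: Optional[float] = None      # where the issue was noticed
    longitude: Optional[float] = None
    photo_url: Optional[str] = None       # optional photo taken at the scene
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    flagged_abusive: bool = False         # set by other users; triggers removal
    priority_votes: int = 0               # community prioritisation of actions

def submit(report: FeedbackReport, repository: List[FeedbackReport]) -> None:
    """Store the report in an online repository (here just a list) so that it
    can later be exchanged with stakeholders such as urban designers."""
    if not report.flagged_abusive:
        repository.append(report)

# Example: a passer-by reports a pile of rubbish noticed in a park.
repo: List[FeedbackReport] = []
submit(FeedbackReport(comment="Pile of rubbish near the park entrance",
                      latitude=51.53, longitude=-0.12), repo)
print(len(repo), "report(s) stored")
```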

3. BACKGROUND RESEARCH

The research, designed to identify a suitable service design strategy, was divided into two phases (see Figure 1). The scoping phase, or background research, was carried out to identify key issues and existing barriers preventing citizens from becoming more involved with their local communities. The scoping study, which comprised both theoretical exploration and a series of design ethnography studies, revealed that accessibility and usability are considered ‘hygiene factors’ and cannot be used to motivate people to engage with the system (Chen, 2010). According to the scoping study, the majority of participants were not motivated to spend their valuable time reporting public space problems and were consequently unfamiliar with similar services, such as SeeClickFix (www.seeclickfix.com), FixMyStreet (www.fixmystreet.com) and CitySourced (www.citysourced.com). Ethnographic research conducted with people from different age groups (ranging from 19 to 80 years old) and various cultural backgrounds (e.g. an Israeli designer, a Canadian programmer and a Caribbean accountant) pointed out that “trustworthiness” is the first step toward enhancing user engagement and the relationship between users and the system. If people perceive the VYV system as ‘reliable’ and deserving of their trust, they may decide to engage with it. Hence, the study of ‘trust’ in this case focuses on how people develop their perceptions and/or beliefs that someone or something is reliable and worthy of their confidence.

4. AIM & OBJECTIVES

As a result, this paper will discuss relationships between service design and trust. The main aim is to develop service design guidelines to enhance user trust in the VYV system.

This study includes four objectives as follows:

1. To discover relationships between service design and trust in an online environment
2. To identify service dimensions and design elements influencing trust in the VYV system
3. To gain user opinions on existing feedback systems to establish the current level of trust
4. To identify essential service design components for enhancing trust in the VYV system

5. LITERATURE REVIEW

Trust – It is commonly agreed that ‘trust’ is an abstract and multi-faceted concept that is difficult to define. Hence, there is no single authoritative definition of the term. This might be because trust has been studied extensively by various disciplines, e.g. psychology, sociology and business management. One of the most widely accepted definitions in the literature on online trust is the work of Mayer, Davis and Schoorman (1995), which described the term as “the willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that other party.” Another influential work in the field of online trust is that of Luhmann (1979), where trust is defined as a mechanism to reduce complexity in order to cope with uncertainty: when dealing with a large number of risks and aspects that are not well understood, people need to base their decisions on trust.

Trust & Online Environment – Since the VYV system operates in the online environment, it is important to study online trust. Shneiderman (2000) pointed out that trust is one of key factors determining the success of most online entities, e.g. websites. An extensive literature review carried out by Beldad, Jong and Steehouder (2010) suggested that the level of trust of an online service initiative (either commercial or non-commercial) depends on the quality of service, the technology used to delivery service and the organization behind the initiative.

Wang and Emurian (2005) described key characteristics of online trust as follows:

1. Trustor and Trustee: In an online context, the object of trust is an online entity (e.g. a website and its content) and its provider (an organisation). This means that users must trust not only a website, but also the providers behind the site. This feature makes the nature of online trust even more complicated.

2. Vulnerability: Owing to the highly complicated online environment, it is harder to predict and control online providers’ behaviours. This characteristic makes users more vulnerable than in offline situations. Two major risks (loss of money during transactions and private data being misused) are considered serious and therefore make online trust more critical for successful user engagement.

3. Individual Matters: Online trust is influenced by an individual’s perception of and attitudes towards the Internet and/or technology. However, the level of trust can increase or decrease based on the experience and satisfaction of previous online interactions. Jøsang, Ismail and Boyd (2007) argued that, in the initial phase of online interaction, extrinsic trust factors (such as the reputation of the service provider and ease of navigation) can significantly increase or decrease the level of trust.

Service Design – The term is defined as “the design of intangible experiences that reach people through many different touch-points, and that happen over time” (Moggridge, 2006). According to Eckersley (2008), service development involves design at three different levels: 1) strategic design planning (e.g. creating a business model for a new service), 2) design planning (e.g. developing ideas for new services) and 3) design implementation (e.g. turning service ideas into reality). Recent studies, such as the work of Sangiorgi (2010), argued that services should not be seen as an ‘end’ result, but could be considered a ‘means’ for collaboration, because production cannot be separated from consumption: both service providers and service users determine the outcome and the quality of the service. Hence, VYV has the potential to facilitate better collaboration between citizens and local authorities. Nevertheless, problems regarding user engagement and trust must be overcome.

Trust & Service Design – In order to establish online trust, Dayal, Landesberg and Zeisser (2003) proposed the trust pyramid (see Figure 2), which comprises six key components: state of the art security, merchant legitimacy, fulfilment, tone and ambient, customer control and customer collaboration. Evidently, reliable security measures and brand reputation are required to create the initial trust. However, these components are considered basic building blocks and cannot be used to differentiate the site or establish a long-term relationship.

Arguably, service design has the potential to address all components in the trust pyramid.

Good use of service design can satisfy/exceed user expectations; deliver an appealing

appearance; allow users to personalize the site to suit their needs and abilities; and promote

co-creation between the site and users. Moreover, service design can reduce negative

(4)

experience and deliver positive experience (Saco and Goncalves, 2008), which could increase user trust in the long run. Long (2004) suggested that the process of trust building begins with visual indicators and is continually reinforced through behaviours. Thus, he proposed three key dimensions of trust in e-business, which are also applicable for other online entities (see Table 1). A similar idea was suggested by Petrovic, Fallenböck and Kittl (2003), as they noted that trust building components can be broadly divided into three groups: the quality of information, reputation and usability.

Table 1: Dimensions of trust in e-business

Trust Dimension | Relevance to e-business
Appearance      | Graphic design, information architecture, information design
Behaviour       | Interface design
Reputation      | Past experience, experience of others, expression of brand

Source: Long (2004)

Trust & Service Quality – According to Moritz (2005) services are intangible, complex experiences. Since the production and consumption happen at the same time, a service cannot be stored or owned. These distinctive characteristics make it difficult to control the overall quality, which in turn makes it hard to assure positive experience and trust. In this case, the service quality framework (SERVQUAL) proposed by Zelthaml, Parasuraman and Berry (1990), which was widely adopted in the service design community, was used as a basis for evaluating the service quality of existing feedback systems and investigating how design elements (e.g. layouts) might influence the perceived service quality from the user perspective. SERVQUAL provides a basis for many quality measurement models/tools, especially in the hospitality sector. The framework comprises five service dimensions:

1. Tangibles: Appearance of physical facilities, equipment, personnel and communications materials

2. Reliability: Ability to perform the promised service dependably and accurately 3. Responsiveness: Willingness to help customers and provide prompt service

4. Assurance: Knowledge and courtesy of employees and their ability to convey trust and confidence

5. Empathy: Caring, individualised attention the firm provides its customers

More recent concepts, such as the work of Stickdorn and Schneider (2013), also embraced similar components, namely (tangible) evidence and user-centred approach.
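Since SERVQUAL later underpins the questionnaire in this study, the sketch below shows one plausible way of turning 1–5 Likert ratings into a perceived-quality score per dimension. The ratings are invented placeholders rather than survey data, and the mean-per-dimension aggregation is an assumption, not the authors’ exact analysis.

```python
from statistics import mean

# Hypothetical 1-5 Likert ratings for one feedback system, grouped by
# SERVQUAL dimension (illustrative values only).
responses = {
    "Tangibles":      [4, 5, 4],
    "Reliability":    [5, 4, 5],
    "Responsiveness": [3, 4, 4],
    "Assurance":      [4, 4, 5],
    "Empathy":        [3, 3, 4],
}

# A simple perceived-quality score per dimension is the mean rating.
scores = {dimension: mean(ratings) for dimension, ratings in responses.items()}
for dimension, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{dimension:15s} {score:.2f}")
```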

Trust & Public Sector – The understanding of relationships between trust and service design in the private sector appears to be more advanced than that in the public sector. Lipp (2003) observed that governmental sites are often perceived as “cold” and “impersonal” which make people feel like they being treated as a number that increases the resistance toward the systems. As a result, this research has an opportunity to bring about the better understanding of relationships between trust and service design in governmental sites, especially digital feedback platforms. This new knowledge could enhance trust as well as civic engagement.

6. METHODOLOGY

In order to properly investigate relationships between trust and service design, a mind map (Figure 3) was used to identify all key issues so that suitable research tools can be selected.

6.1. Case Studies

Purpose: This method was chosen to study good practice in the area of online services.


Criteria: Three service providers from the private and public sectors were selected based on their outstanding service delivery. The studies concentrated on how they establish trustworthiness through service design and how their principles/approaches could be applied to digital feedback systems like the VYV. The three cases are:

Birmingham Own Health (http://birminghamownhealth.co.uk) – This innovative service was designed to support people with long-term medical conditions in Birmingham by offering one-to-one advice and support in several languages. This case was selected because it shows how good service design can enhance user experience and build trust not only in healthcare services, but also in the NHS.

Patient Opinion (http://www.patientopinion.org.uk) – This is an independent feedback platform where people can freely share their opinions regarding healthcare services. The platform is well received, since it brings together user experiences and service measurement (Parker and Heapy, 2006). The open platform and transparent interaction have built trust not only in the website, but also in healthcare practitioners.

Amazon.com – According to consumer research carried out by Millward Brown (2007), Amazon was considered the top-performing brand, especially in terms of user trust. Amazon has achieved this high level of trust due to its transparent transaction procedures, e.g. no hidden charges. The transparent ordering and delivery process and reliable security measures have helped ensure customer satisfaction and a positive user experience, which in turn has made the company the leading online brand.

Process: Secondary data were collected from various sources. Thematic analysis was used, as it excels in extracting key issues (themes) from qualitative data. All themes were compared to identify characteristics and trust building practices that all three cases had in common.
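The comparison step can be pictured as a simple set intersection over the themes coded for each case. The theme labels below are examples echoing the findings reported in Section 7.1, and the intersection logic is an assumption about how the ‘common practices’ could be isolated, not a description of the actual analysis.

```python
# Themes coded per case (illustrative labels only, echoing Section 7.1).
themes_by_case = {
    "Birmingham Own Health": {"professional staff", "user-centred communication", "transparent process"},
    "Patient Opinion":       {"user-centred communication", "two-way conversation", "transparent process"},
    "Amazon":                {"transparent process", "reliable security", "user-centred communication"},
}

# Practices shared by all three cases form the basis of the guidelines.
common_practices = set.intersection(*themes_by_case.values())
print(sorted(common_practices))  # ['transparent process', 'user-centred communication']
```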

6.2. Expert Interviews

Purpose: The study aimed to gain experts’ opinions and suggestions on how service design could be used to enhance user trust. Five expert interviews were conducted.

Criteria: Experts from industry and academia were chosen based on their work/research experience on key subjects, e.g. service design and user experience. These experts include:

• A strategic manager of a leading international electronics company
• A service designer at a leading service design consultancy in the UK
• A renowned academic in the field of service design and design management
• An eminent academic in the field of human-centred design and human factors
• A course director of an Interactive Media Design programme at a leading university

Process: Semi-structured interviews were chosen, since they ensure that all the key issues are covered while allowing interviewees to discuss other matters that are important to them. While questions differed from one interviewee to another, most experts were asked to comment on existing feedback systems, identify key barriers that prevent people from actively engaging with their local authorities, discuss key aspects of service design that could enhance trust, and give practical suggestions for systems like VYV and SeeClickFix. Their answers were recorded, transcribed and later analysed using thematic analysis.

6.3. Questionnaire Survey

Purpose: This study aimed to measure 1) the relationships between different service dimensions and user trust and 2) the impacts of design elements on user trust.

Sampling Criteria: Random sampling was employed in order to collect the perceptions of the general public from different educational, professional and cultural backgrounds. 90 participants took part in this survey: 50% were male and 50% were female. Their ages ranged from under 25 to over 50 years old, and the age groups were equally distributed.

Process: The questionnaire was designed based on the SERVQUAL framework and the Likert scale. Thus, it included questions regarding the five key service dimensions and the user profile. The survey was conducted face-to-face to avoid any misinterpretation of the questions.

6.4. User Test

Purpose: The study, comprising several research tools, was used to gain rich qualitative information regarding user perception and user trust. Two existing online feedback platforms, SeeClickFix (http://www.seeclickfix.com/citizens) and Camden Council’s feedback form (http://www.camden.gov.uk/ccm/navigation/council-and-democracy/having-your-say/complaints-and-suggestions), were chosen as test subjects for the user test.

Sampling Criteria: Experts suggested that four to six participants are suitable for a qualitative usability study (Covey, 2002). In this case, six participants from different educational and cultural backgrounds (ranging from a 24-year-old Chinese student to a 67-year-old retired engineer) were chosen in order to make sure that the VYV service is suitable for all groups.

Criteria: The feedback form from the Camden Council was chosen because this progressive council has begun to use strategic design as a means to engage local residents and investors.

SeeClickFix was selected due to its popularity. In 2009, it covered 25,000 towns and 8,000 neighbourhoods, and the number of users continues to grow (SeeClickFix, 2009).

Process: The four stages of the user test are described below.

Observation was chosen due to its ability to identify problems and real user behaviours (Bell, 2005). The participants were asked to explore Camden Council’s feedback page and SeeClickFix platform freely without a time limit. Their actions and facial expressions were carefully observed. The main purpose of this stage was to find out users’ reactions and check if their first impressions were positive or negative.

Think Aloud Protocol (TAP) was selected, since it excels in obtaining an insight into how people use their knowledge and previous experience, as well as investigating the cognitive actions that people undertake while approaching tasks or tackling problems (Someren, Barnard and Sandberg, 1994). The participants were asked to perform a task of sending a comment via the two systems. The task was carried out in participants’ homes/offices in order to ensure that input devices and computer settings would not affect their ability to perform the task. The task comprised three activities: 1) finding the feedback form, 2) entering comments, and 3) editing comments. The participants were requested to continuously describe what went through their minds as they performed the task. Their behaviours, verbal descriptions and facial expressions were recorded using a video camera, which enabled in-depth analyses afterwards.

Semi-structured interviews were used to clarify certain issues identified during the observations and TAP, and to check that the interpretations were correct. The participants were asked to 1) describe their overall experience, 2) report whether they felt comfortable using the feedback systems, 3) self-assess their level of trust, 4) suggest how to enhance the trustworthiness of both sites and 5) decide which system they preferred.

A reflective questionnaire was employed to capture reflective thoughts on the entire experience of reporting issues via the two different feedback platforms. The participants were asked to score both systems on seven aspects: 1) Interface Design, 2) Usability, 3) Organization, 4) Quality of Services, 5) Security, 6) Privacy and 7) Customer Relationship Management (see Table 2). After each participant completed the questionnaire, the total scores were calculated to determine which system they preferred. The outcomes were later compared with the interview results.

Table 2: Examples of Questions (1 = totally disagree and 5 = totally agree)

Factor                  | Questions (rated 1–5)
Interface Design        | The interface design is clear and professional
                        | The appearance is in keeping with the services provided
Usability               | The website is easy to navigate
                        | The website is organized in a logical manner
Organization            | The organization appears to be legitimate & respectable
                        | The organization shows a great deal of respect for users
Quality of Services     | The complaint handling process appears to be transparent
                        | The information about its services is comprehensive
Security                | The website explains its terms & conditions clearly
                        | The website has adequately addressed users’ security concerns
Privacy                 | The website makes users feel confident to leave personal data
                        | The website has adequately addressed users’ privacy concerns
Relationship Management | The progress of the complaint handling process can be traced
                        | The website offers a variety of channels to suit users’ needs

7. PRINCIPAL FINDINGS

7.1. Case Study Results

Although all cases are different, they share common practices which led to their success:

• User-centred design approach – in both visual and verbal communication; for example, the choice of words used on the Patient Opinion website is very friendly and caring, e.g. using “a bit about you and your story” instead of “personal information”
• Good quality services provided by professional staff; for instance, the success of the Birmingham Own Health service was down to a team of fully trained and highly experienced care managers who offer professional advice and consistent services
• Transparent process – no hidden information, and users are fully informed. Amazon is considered one of the most trustworthy online brands due to its transparent process; it provides clear, detailed information on the subjects that concern people the most, e.g. ordering and return policies, security and privacy, and estimated delivery dates
• Two-way conversations; for instance, on the Patient Opinion website, patients can share their experiences regarding healthcare services and receive responses from other patients and/or practitioners. Open dialogue between patients and healthcare professionals has helped strengthen their relationships, which in turn builds trust

7.2. Expert Interview Results

All experts agreed that the key to enhancing user trust is demonstrating that the people behind the system (e.g. local authorities) really care about users and their opinions. Appropriate ways of showing care and attention are demonstrated through courteous behaviour – for example:

• Making the feedback system fit people’s lives – providing as many channels as possible so that users can use the means that suit their needs, e.g. text messages and emails
• Making the whole process transparent and intuitive/foolproof
• Always responding to user comments and exceeding their expectations
• Keeping users informed – letting users know how services will be performed
• Solving problems quickly if anything goes wrong
• Conveying the compelling benefits and advantages of engaging with the system – convincing users that the VYV system is better than existing platforms
• Giving users a chance to contact a real person, to reassure them that someone is responsible for dealing with their comments/suggestions

One expert suggested that digital feedback systems like the VYV may benefit from the “Witnessed Presence and YUTPA framework” (Nevejan, 2009). The theory explains that trust building is a process rooted in human perception. The presence of real persons (witnesses) can make people feel at ease when involved in important matters, e.g. signing a contract. The same principle applies to digital communication – having real persons oversee and take responsibility for the whole process could help reassure people. The YUTPA (being with You in Unity of Time, Place and Action) framework was developed to address four dimensions of trust building: time, place, action and relation. Achieving an appropriate balance of these four dimensions in the system design could enhance user trust.

7.3. Questionnaire Survey Results

The first objective of the questionnaire survey was to determine the importance of each service dimension to the trustworthiness of digital feedback systems like the VYV. It can be seen that all dimensions strongly affect the trustworthiness of the system (see Table 3). The results revealed that the appearance and the behaviour of the system are equally important.

Table 3: How strongly could these service factors influence your opinion regarding the trustworthiness of digital feedback systems? (1 = no influence; 5 = significant influence)

Service Dimensions                                             1    2    3    4    5
Responsiveness
1. Give prompt service to users                                2    4   11   52   31
2. Always show a willingness to help users                     1    1   21   44   31
3. Never too busy to respond to user requests                  3    3   22   33   39
4. Tell users exactly when services will be performed          1    5   16   27   51
Tangibles
1. Have an up-to-date website                                  0    3   18   42   37
2. Have a visually appealing website                           0   14   23   45   18
3. The appearance is in keeping with services provided         3    6   23   47   21
4. Have a neat and clear website design                        0    7   20   49   24
Assurance
1. Make users feel safe when entering personal information     1    7    7   20   65
2. Be consistently courteous with users                        2    3   31   43   21
3. Show the transparent process of dealing with information    1    3   19   43   34
4. Have knowledge to answer users’ questions/concerns          0    9   16   40   41
Reliability
1. Always honour the promise                                   1    1   12   41   45
2. Show sincere interest in solving users’ problems            1    4   10   42   43
3. Provide services at the time promised                       1    2    9   41   47
Empathy
1. Give each user dedicated attention                          2    1   31   45   21
2. Offer operating hours that are convenient for all users     2    2   23   46   27
3. Have staff who give users personal attention                1    6   26   46   21
4. Have flexible input/output channels that are convenient
   for all users                                               2    3   22   53   20

Note: All values presented in the table are percentages.
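Because Table 3 reports the percentage of respondents choosing each scale point rather than mean ratings, a mean can be recovered as a weighted average. The short sketch below illustrates this for two rows of the table; the calculation is offered as a reading aid and was not part of the original analysis.

```python
# Percentage of respondents selecting each point on the 1-5 scale (from Table 3).
items = {
    "Always honour the promise":         [1, 1, 12, 41, 45],
    "Have a visually appealing website": [0, 14, 23, 45, 18],
}

for name, distribution in items.items():
    # Weighted mean: sum(scale point * share of respondents) / 100
    mean_rating = sum(point * pct for point, pct in enumerate(distribution, start=1)) / 100
    print(f"{name}: {mean_rating:.2f}")
# 'Always honour the promise' -> (1*1 + 2*1 + 3*12 + 4*41 + 5*45) / 100 = 4.28
```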

The second objective of the survey was to determine the impacts of design elements on the trustworthiness of digital feedback systems like the VYV. The participants were asked to decide whether particular design features affect the trustworthiness of a website. The results showed that the content and information design significantly affect trust (see Tables 4–10).

Table 4: The impact of security information on trustworthiness

Question 1: When purchasing online, does the security policy, such as what system a company has in place to protect your account, influence your opinion regarding the trustworthiness of the website?
Yes: 90% | No: 10%
Note: Participants who answered ‘no’ explained that this kind of information is not clearly presented and some customers may miss it. Besides, users do not understand how the security mechanisms work.

Table 5: The impact of endorsements on trustworthiness

Question 2: When purchasing a product/service from a website, does the endorsement of well-known brands in related fields influence your opinion regarding the trustworthiness of the website?
Yes: 90% | No: 10%
Note: Participants who answered ‘no’ perceived the endorsement as another form of advertising. In addition, this statement is not properly verified.

Table 6: The impact of customer feedback on trustworthiness

Question 3: When purchasing services online, do feedback statements from previous customers and peer reviews influence your opinion regarding the trustworthiness of the website?
Yes: 91% | No: 9%
Note: Participants who answered ‘no’ explained that feedback is subjective, they do not know the persons who wrote the comments, and the reviews might be created by the service provider.

Table 7: The impact of using a real image on trustworthiness

Question 4: When visiting websites that allow people to report non-emergency issues (e.g. potholes), does a photograph taken at the scene where the problem occurs influence your opinion regarding the trustworthiness of the website?
Yes: 92% | No: 8%
Note: Participants who answered ‘no’ commented that pictures are not the main aspect of the report; the trustworthiness depends on the quality/clarity of the reports.

Table 8: The impact of the terms and conditions on trustworthiness

Question 5: When purchasing services online, does the ‘terms and conditions’ statement influence your opinion regarding the trustworthiness of the website?
Yes: 86% | No: 14%
Note: Participants who answered ‘no’ explained that they hardly read any policy. Some suggested that the website gives too much information. Moreover, some observed that every website has this section, so this standard announcement makes no difference.

Table 9: The impact of the reassurance message on trustworthiness

Question 6: When making a transaction, does the reassurance statement influence your opinion regarding the trustworthiness of the website?
Yes: 93% | No: 7%
Note: Participants who answered ‘no’ explained that the claim is not properly verified.

Table 10: The impact of the overall design on trustworthiness

Question: When deciding to make a complaint regarding public space design online, which webpage design is considered more trustworthy?
Option A: Camden Council’s feedback form (https://forms.camden.gov.uk/cus/servlet/ep.app?ut=X&type=68748&auth=203) – 23%
Option B: SeeClickFix (http://seeclickfix.com/chicago) – 77%
Note: The minority of participants who chose Option A did so because they were familiar with this style of feedback form and interface; the design was therefore perceived as simple and straightforward. In addition, some commented that this kind of form encourages more detailed explanations and shows that the local authority really cares about people’s opinions.

In the last question, the majority of participants chose Option B for various reasons:

• Firstly, most participants did not want to provide their personal information and did not see valid reasons for requesting personal details.
• Secondly, the step-by-step reporting system helps people clarify their thoughts and makes the outcomes more specific and actionable.
• Many participants suggested that “visuals speak louder than words” – good graphic design works better than verbal descriptions. Including real pictures also gives a feeling of accuracy, reliability and sincerity.
• Many participants liked the map, since it makes reporting problems easier. The location-based feedback system reassured many participants that “my opinion will not be just a piece of information”; they felt that the local authority sincerely wants to find out where the problem is and solve it.
• The interface design of Option B looks more professional and thus makes the site and the operating team seem more reliable. For most people, the simple layout of Option B makes the process seem less of a hassle, more straightforward and easier to use.

7.4. User Test Results

This section summarises the key findings that emerged from the series of qualitative studies. The main conclusions were derived from the reflective questionnaires and interviews, since all issues identified during the observations and TAP were properly discussed during the semi-structured interviews. All participants were able to send reports through both systems without any major problem, even though they were encountering them for the first time. Hence, usability was not regarded as a key issue; user perception and experience were the main determinants.

The overall results revealed no clear preference (see Table 11). Two participants preferred Camden Council’s feedback form; two participants chose the SeeClickFix platform; and the other two were indecisive. Interestingly, the questionnaire results suggested that four out of six preferred the SeeClickFix platform. However, when discussing their overall experience, the majority of participants preferred Camden Council’s feedback form.

Table 11: Comparison of Reflective Questionnaire and Interview Results

Participant No. | Age | Total Score: Camden Council’s Feedback Form (A) | Total Score: SeeClickFix Platform (B) | Reflective Questionnaire Result | Interview Result
1               | 24  | 59                                              | 90                                    | B                               | B
2               | 28  | 75                                              | 40                                    | A                               | A
3               | 40  | 96                                              | 84                                    | A                               | A
4               | 43  | 46                                              | 69                                    | B                               | B
5               | 67  | 79                                              | 84                                    | B                               | A
6               | 80  | 97                                              | 101                                   | B                               | A

Note: The maximum total score for the reflective questionnaire is 115.
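As an illustration of how the questionnaire column of Table 11 can be read, the sketch below compares each participant's total scores for the two systems. The data are copied from Table 11; treating the higher total as the questionnaire-based preference is a straightforward assumption rather than a documented step.

```python
# (participant no., age, total score A: Camden form, total score B: SeeClickFix), from Table 11.
results = [
    (1, 24, 59, 90), (2, 28, 75, 40), (3, 40, 96, 84),
    (4, 43, 46, 69), (5, 67, 79, 84), (6, 80, 97, 101),
]

for pid, age, score_a, score_b in results:
    preference = "A" if score_a > score_b else "B"  # higher total score wins
    print(f"Participant {pid} (age {age}): questionnaire prefers {preference}")

# Four of the six questionnaire totals favour B (SeeClickFix), matching the text,
# even though the interviews favoured A (the Camden form) for participants 5 and 6.
```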

When analysing the comments carefully, it was observed that the SeeClickFix platform received more positive comments, e.g. it makes good use of visual communication, looks professional, seems constantly updated (“look like a living system that is properly taken care of and regularly edited”), and sounds easy to use. While Camden Council’s feedback form was praised in terms of accuracy, some participants found the form rather “overwhelming”, which discouraged them from using it. When discussing the level of trust, Participants 1 and 4 stated that they trusted the SeeClickFix platform because “it looks professional and takes me seriously.” Participants 3, 5 and 6 found both systems trustworthy: while Camden Council’s feedback form is “clear and allows users to make all kinds of comment in one page”, the SeeClickFix platform is “easy to read”, looks “very advanced and impressive” and has “a useful feature (the map).” Since there was no clear winner, the participants were asked to recommend how to improve the trustworthiness of both systems:

Camden Council’s Feedback Form should give an option for users to report their concerns without submitting their personal data. If personal data is needed, the system must assure users that it has reliable security measures and privacy protection in place.

SeeClickFix should avoid a ‘commercial’ look, since it may reduce sincerity and professionalism of the feedback system. Interestingly, this advice shows that certain design practices used in the private sector may not be suitable for the public sector.

8. DISCUSSIONS AND RECOMMENDATIONS

All the key issues affecting user trust in public feedback systems were identified and grouped using thematic analysis, as shown in Figure 4.

Further explanation of the key themes and their implications for service design is as follows:

1. Individual Propensity to Trust: The survey results revealed that some people need more reassurance than others. While security and privacy policies reassured most participants, some believed that this kind of claim is not reliable unless it “has been properly verified”. While most participants believed in peer review, some were very sceptical and argued that feedback might not be genuine. Experts recommended that an efficient way to persuade users with a low propensity to trust is to familiarise them with the system and the people behind it: the more they understand, the more likely they are to trust.

2. User Knowledge of Service Provider: Online trust is difficult to build, since users have limited channels to interact with the real persons behind the system. The user test results demonstrated that participants are likely to trust online services provided by organisations that they are familiar with. Egger (2003) described this factor as “pre-purchase knowledge”. Hence, several interviewees suggested that getting people to know the VYV system is the first step toward building trust. Next, the system must continually reassure users by providing a good experience. The ultimate goal is to turn users into brand ambassadors, because ‘word-of-mouth’ is the most effective communication tool for building trust.

3. Organization Reputation: Reputation offers users a “basis for predictability of future capability and behaviour” (Doney and Cannon, 1997). The impact of reputation is significant when people face a little-known party, since it influences the first impression. This phenomenon explains why several participants chose Camden Council’s feedback system over the SeeClickFix platform: although the latter looks more professional, most people hardly know it, whereas every participant knew Camden Council. Therefore, it is important to raise the profile of the VYV system.

4. Interface Characteristics: The case studies and user tests revealed that the desired characteristics for the VYV interface are: user-friendly, simple, intuitive and professional. These characteristics can be achieved through easy-to-read fonts, neutral colours and simple layouts. Many users suggested that useful design elements, such as maps and location pinpoint icons, should be included, since they help make comments more precise.

5. Information/Content: The user test and questionnaire survey results suggested that providing ‘sufficient’ information is the key to success. Too much information can overwhelm users, whilst insufficient data could give a negative impression. The quality of information is equally important: constantly updated information gives the impression that the system is ‘alive’ and ‘effective’. The design of information is also crucial to the trust building process, since it affects users’ first impressions. According to the user interviews, clean information design gives an impression of accuracy, sincerity and reliability. Anonymity must be maintained throughout the whole process.

6. Service Quality: The survey showed that ‘reliability’ is the most important factor influencing user trust. Several researchers agreed with this result, as reliability directly affects user confidence. Experts suggested that reliability could be achieved by customer expectation management – e.g. keeping users informed throughout the process.

Table 12: Guidelines for Enhancing User Trust in the VYV System

1. Individual Propensity to Trust – Being flexible & transparent
• Allow users to engage with the system on their own terms by providing multiple channels for sending and tracing their reports
• Keep the process transparent
• Maintain two-way communication
• Raise the profile of the system

2. User Knowledge of Service Provider – Being accountable
• Provide a physical presence for the organisation, e.g. a postal address, a contact person and information about key staff
• Supply a brief history and the development of the organisation
• Present affiliation information of the organisation

3. Organization Reputation – Being officially recognized
• Provide information about key achievements, e.g. technology
• Present third-party endorsements and/or high-profile partners
• Offer information that could reassure users, e.g. customer feedback and industrial reviews

4. Interface Characteristics – Being professional
• Pay attention to small details
• Avoid a ‘commercial’ appearance
• Give a professional look
• Ensure accessibility and usability
• Offer a transparent process
• Inform users of the procedures of the whole interaction process
• Allow users to customize the interaction to a certain extent, e.g. choosing the language

5. Information/Content – Being protective
• Present security and privacy policies clearly and completely
• Maintain anonymity
• If personal data is required, valid reasons must be clearly explained and the data must be treated as highly confidential
• Constantly update information to demonstrate prompt response and effectiveness
• Provide clear descriptions of how comments will be processed and used

6. Service Quality – Being courteous & reliable
• Give users immediate feedback or regular confirmations
• Allow users to track their reports in real time
• Simplify the interaction process
• Give users an option to contact staff if necessary
• Inform users how the whole reporting system works
• Always honour promises and try to exceed user expectations

By achieving all these service characteristics, the feedback system can become more trustworthy, which is the first step towards building user relationships and encouraging active participation. Although this research was conducted to identify a service design strategy for the VYV, this new knowledge could be generalised and applied to other applications provided by government. As a result, the pyramid of trust has been modified to illustrate how service design can support all components of trust building (Figure 5). While most service design frameworks focus on processes and methods (see the work of Stickdorn and Schneider (2013) and Moritz (2005) for examples), practical guidance on how to enhance online trust and service quality through design is still lacking. These guidelines could assist service developers in planning online public services that are new to most audiences, as they highlight and prioritise key factors that need to be taken into consideration.

9. CONCLUSION

Online trust was identified as the key factor in encouraging people to get involved with digital public feedback systems like the VYV. Subsequently, a series of qualitative and quantitative research methods was employed to investigate how to successfully integrate trust into the VYV system. After exploring the relationships between online trust and service quality, and between online trust and design elements, six key factors were identified: 1) individual propensity to trust, 2) user knowledge of the service provider, 3) organization reputation, 4) interface characteristics, 5) information/content and 6) service quality. Each component was thoroughly investigated to create a complete service design guideline. This new knowledge also helps advance the understanding of the relationship between online trust and service design in the context of a public feedback system.


10. REFERENCES

1. Beldad, A., Jong, M.D. and Steehouder, M. (2010) How shall I trust the faceless and the intangible? A literature review on the antecedents of online trust. Computers in Human Behavior, Vol. 26, No. 5, 857 – 869.

2. Bell, J. (2005) Doing your Research Project: A guide for first-time researchers in education, health and social science (4th edn). Glasgow: Open University Press.

3. Chen, Y.P. (2010) Inclusive Interaction Design Strategy for the VoiceYourView System. Unpublished Research Report. Brunel University.

4. Covey, D.T. (2002) Usage and Usability Assessment: Library Practices and Concerns. Washington: Digital Library Federation, Council on Library and Information Resources.

5. Dayal, S., Landesberg, H. and Zeisser, M. (2003) How to build trust online. In O. Petrovic, et al. (eds.) Trust in the Network Economy. Austria: Springer, 89 – 95.

6. Doney, P.M. and Cannon, J.P. (1997) An examination of the nature of trust in buyer-seller relationships. Journal of Marketing, Vol. 61, No. 2, 35 – 51.

7. Eckersley, M.D. (2008) Designing Human-centred Services. Design Management Review, Winter 2008, 59 – 65.

8. Egger, F.N. (2003) From Interactions to Transactions: Designing the Trust Experience for Business-to-Consumer Electronic Commerce. Unpublished PhD Thesis, Eindhoven University of Technology.

9. Jøsang, A., Ismail, R. and Boyd, C. (2007) A survey of trust and reputation systems for online service provision. Decision Support Systems, Vol. 43, No. 2, 618 – 644.

10. Lipp, P. (2003) On technical trust: An introduction. In O. Petrovic, et al. (eds.) Trust in the Network Economy. Austria: Springer, 243 – 252.

11. Long, K. (2004) Customer Loyalty and Experience Design in e-business. Design Management Review, Vol. 15, No. 2, 60 – 67.

12. Luhmann, N. (1979) Trust and Power. Chichester: John Wiley.

13. Mayer, R.C., Davis, J.H. and Schoorman, F.D. (1995) An integrative model of organizational trust. Academy of Management Review, Vol. 20, No. 3, 709 – 734.

14. Millward Brown (2007) Satisfaction and Trust in the State Services. [WWW] Millward Brown. Available from: http://www.ssc.govt.nz/display/document.asp?DocID=7087 [Accessed 30/07/10].

15. Moggridge, B. (2006) Designing Interactions. Cambridge: MIT Press.

16. Moritz, S. (2005) Service Design – Practical Access to an Evolving Field. Köln International School of Design.

17. Nevejan, C. (2009) Witnessed Presence and the YUTPA Framework. PsychNology Journal, Vol. 7, No. 1, 59 – 76.

18. Parker, S. and Heapy, J. (2006) The Journey to the Interface: How Public Service Design Can Connect Users to Reform. Demos.

19. Petrovic, O., Fallenböck, M. and Kittl, C. (2003) Paradigm Shift in the Network Economy: From Security to Trust. In O. Petrovic, M. Fallenböck, M. Ksela and C. Kittl (eds.) Trust in the Network Economy Vol. 2. Vienna: Springer-Verlag.

20. Saco, R.M. and Goncalves, A.P. (2008) Service Design: An Appraisal. Design Management Review, Vol. 19, No. 1, 10 – 19.

21. Sangiorgi, D. (2010) Transformative Services and Transformation Design. In Proceedings of the 2nd Nordic Conference on Service Design and Service Innovation, December 1 – 3, 2010, Oslo, Norway.

22. SeeClickFix (2009) 25,000 Towns Just Launched on SeeClickFix. [WWW] SeeClickFix. Available from: http://seeclickfix.blogspot.com/2009/10/25000-towns-just-launched-on.html [Accessed: 03/02/12].

23. Shneiderman, B. (2000) Designing Trust into Online Experiences. Communications of the ACM, Vol. 43, No. 12, 57 – 59.

24. Someren, M., Barnard, Y. and Sandberg, J. (1994) The Think Aloud Method: A practical guide to modeling cognitive processes. London: Academic Press.

25. Stickdorn, M. and Schneider, J. (eds.) (2013) This is Service Design Thinking. Amsterdam: BIS Publishers.

26. Wang, Y.D. and Emurian, H.H. (2005) An Overview of Online Trust: Concepts, Elements and Implications. Computers in Human Behavior, Vol. 21, No. 1, 105 – 125.

27. Whittle, J., Simm, W., Ferrario, M., Frankova, K., Garton, L., Woodcock, A., Nasa, B., Binner, J. and Ariyatum, B. (2010) VoiceYourView: Collecting Real-time Feedback on the Design of Public Spaces. In UbiComp Proceedings 2010, September 26 – 29, 2010, Copenhagen, Denmark.

28. Zeithaml, V.A., Parasuraman, A. and Berry, L.L. (1990) Delivering Quality Service: Balancing Customer Perceptions and Expectations. New York: The Free Press.

Acknowledgement

The authors acknowledge the support of the RCUK voiceYourview project EP/H007237/1 Sandpit: voiceYourview – Making Public Places Safer

http://gov.epsrc.ac.uk/NGBOViewGrant.aspx?GrantRef=EP/H007237/1

The researcher would like to thank all participants for their kind co-operation, insightful information and valuable suggestions.
