A Non-Fungible Token, usually referred to by its acronym NFT, uses blockchain technology: data added to a blockchain cannot be changed afterwards. Therefore, while NFTs share similar blockchain technology with cryptocurrencies, their functionality is different.
This functionality enables NFTs to be used to prove ownership of an intangible digital, or tangible physical, asset, and the associated rights the owner has.
The most popular practical applications of NFTs for digital assets are proving ownership of digital art, virtual items in computer games, and music.
The unique features of NFTs are becoming increasingly appealing as we spend more of our time online. Despite this increased popularity, there is a lack of clarity over the final form this digital asset will take. The purchasing process in particular needs to be clarified.
This research developed a model of the purchasing process of NFTs and the role of trust in this process. The model identified that the purchasing process of NFTs has four stages and each stage requires trust.
In the figure, you can see the four stages in the purchasing process on the left, and the trust required at each of these stages along the center. Finally, on the right, you can see that trust in all four stages leads to trust in an NFT purchase.

Figure 1. Model of consumer trust at each stage of the NFT purchasing process

The four stages of the purchase are: first, set up a cryptocurrency wallet to pay for the NFT and to be able to receive it; second, purchase cryptocurrency with the cryptocurrency wallet; third, use the cryptocurrency wallet to pay for an NFT on an NFT marketplace; and fourth, after-sales service, which may involve returns or some other form of support.
The model supported by our analysis identified four stages of trust: first, trust in the cryptocurrency wallet; second, trust in the cryptocurrency purchase; third, trust in the NFT marketplace; and fourth, trust in after-sales services and resolving disputes.
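The four-stage structure can be read as a simple all-or-nothing check: trust in the purchase only holds if trust is present at every stage. The sketch below is illustrative only; the stage labels and function name are my own, not constructs from the paper.

```python
# Illustrative sketch (labels are my own, not from the paper): the four
# purchasing stages, each with a matching trust requirement.
STAGES = [
    "trust in the cryptocurrency wallet",
    "trust in the cryptocurrency purchase",
    "trust in the NFT marketplace",
    "trust in after-sales services and dispute resolution",
]

def trust_in_nft_purchase(stage_trust):
    """Overall trust requires trust at all four stages."""
    if len(stage_trust) != len(STAGES):
        raise ValueError("expected one trust judgement per stage")
    return all(stage_trust)
```

For example, `trust_in_nft_purchase([True, True, False, True])` returns `False`: a lapse of trust at the marketplace stage undermines the whole purchase.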

Reference:
Zarifis, A. & Castro, L.A. (2022) ‘The NFT purchasing process and the challenges to trust at each stage’, Sustainability, vol.14, no.24:16482, pp.1-13. Available from (open access): https://doi.org/10.3390/su142416482

New research!

Central Bank Digital Currencies (CBDCs) are digital money issued, and backed, by a central bank. Consumer trust can encourage or discourage the adoption of this currency, which is also a payment system and a technology. CBDCs are an important part of the new Fintech solutions disrupting finance, and society more generally. This research attempts to understand consumer trust in CBDCs so that the development and adoption stages are more effective, and satisfying, for all the stakeholders. It verified the importance of trust in CBDC adoption, and developed a model of how trust in a CBDC is built (Zarifis & Cheng 2023).

Figure 1. Model of how trust in a Central Bank Digital Currency (CBDC) is built in six ways

There are six ways to build trust in CBDCs. These are: (1) trust in the government and central bank issuing the CBDC, (2) expressed guarantees for the user, (3) the positive reputation of existing CBDCs active elsewhere, (4) the automation and reduced human involvement achieved by CBDC technology, (5) the trust-building functionality of a CBDC wallet app, and (6) the privacy features of the CBDC wallet app and back-end processes, such as anonymity. The first three trust-building methods relate to trust in the institutions involved, while the final three relate to trust in the technology used. Trust in the technology is like the walls of a new building, and institutional trust is like the buttresses that support it.

This research has practical implications for the various stakeholders involved in implementing and operating a CBDC, but also for the stakeholders in the ecosystem using CBDCs. Those involved in delivering and operating CBDCs, such as governments, central banks, regulators, retail banks and technology providers, can apply the six trust-building approaches so that the consumer trusts a CBDC and adopts it.

Dr Alex Zarifis

Reference

Zarifis A. & Cheng X. (2023) ‘The six ways to build trust and reduce privacy concern in a Central Bank Digital Currency (CBDC)’. In Zarifis A., Ktoridou D., Efthymiou L. & Cheng X. (ed.) Business digital transformation: Selected cases from industry leaders, London: Palgrave Macmillan, pp.115-138. https://doi.org/10.1007/978-3-031-33665-2_6

New research!

Fintech is changing the services offered to consumers, and their relationship with the organizations that offer them. This change is neither top-down nor bottom-up, but is being driven by many different stakeholders in many different parts of the world, making it hard to predict its final form. This research identifies five business models of Fintech that are ideal for AI adoption, growth and building trust (Zarifis & Cheng, 2023).

The five models of Fintech are: (a) an existing financial organization disaggregating and focusing on one part of the supply chain, (b) an existing financial organization utilizing AI in its current processes without changing the business model, (c) an existing financial organization, an incumbent, extending its model to utilize AI and access new customers and data, (d) a startup disruptor involved only in finance, and finally (e) a tech company disruptor adding finance to its portfolio of services.

Figure 1. The five Fintech business models that are optimised for AI

The five Fintech business models give an organization five proven routes to AI adoption and growth. Trust is not always built at the same point in the value chain, or by the same type of organization. Trust building should usually happen where customers are attracted and on-boarded. This means that while a traditional financial organization must build trust in its financial services, a tech-focused organization builds trust when customers are attracted to its other services.

This research also finds that, for all Fintech models, the way trust is built should be part of the business model. Trust is often not covered at the level of the business model and is left to operations managers to handle, but for the complex, ad-hoc relationships in Fintech ecosystems it should be resolved before Fintech companies start trying to interlink their processes.

Alex Zarifis

Reference

Zarifis A. & Cheng X. (2023) ‘The five emerging business models of Fintech for AI adoption, growth and building trust’. In Zarifis A., Ktoridou D., Efthymiou L. & Cheng X. (ed.) Business digital transformation: Selected cases from industry leaders, London: Palgrave Macmillan, pp.73-97. https://doi.org/10.1007/978-3-031-33665-2_4

Mobile apps utilize the features of a mobile device to offer an ever-growing range of functionalities. These apps access our personal data, utilizing both the sensors on the device and big data from several sources. Nowadays, Artificial Intelligence (AI) is enhancing the ability to utilize more data and gain deeper insight. This increase in the access and utilization of personal information offers benefits, but also challenges to trust. The reason we are re-evaluating trust in this scenario is that we need to re-calibrate for the increasing role of AI.

This research explores the role of trust, from the consumer’s perspective, when purchasing mobile apps with enhanced AI. Models of trust from e-commerce are adapted to this specific context. The model developed was tested, and the results support it.

Figure 1. Consumer trust and privacy concerns in mobile apps with enhanced AI

The intention to use the mobile app is impacted by four factors: (1) propensity to trust, (2) institution-based trust, (3) trust in the mobile app, and (4) the perceived sensitivity of personal information.

The first three of those four are broken down further into their constituent parts. (1) Propensity to trust is based on a person's (1a) trusting stance in general, and (1b) their general faith in technology. (2) Institution-based trust is strengthened by (2a) structural assurance and (2b) situational normality. Structural assurance of the internet includes guarantees, regulation, promises and related laws. The user's evaluation of situational normality can be formed by app reviews. Out of the whole model, the institution-based factors are the weakest.

Trust in the mobile app (3) is more complex: it is based on five variables. These are (3a) trust in the vendor, (3b) trust in the app's functionality, (3c) trust in the genuineness of the app, (3d) how human the technology appears to be, and (3e) trust in personal data use.
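Taken together, the factor breakdown reads naturally as a nested structure. A minimal sketch follows; the key names are my own shorthand, not the paper's variable names, and the fourth factor has no sub-parts in the text so it is left empty.

```python
# Factors influencing intention to use the app, with the constituent
# parts of the first three (key names are my own shorthand, not the
# paper's construct labels).
TRUST_FACTORS = {
    "propensity_to_trust": [
        "trusting stance in general",
        "general faith in technology",
    ],
    "institution_based_trust": [
        "structural assurance",
        "situational normality",
    ],
    "trust_in_mobile_app": [
        "trust in vendor",
        "trust in app functionality",
        "trust in genuineness of app",
        "perceived humanness of the technology",
        "trust in personal data use",
    ],
    # The fourth factor is not broken down further in the text.
    "perceived_sensitivity_of_personal_information": [],
}
```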

Those are the main findings of this research. The model is helpful because it can guide the stakeholders involved in mobile apps in how to build trust. By using the model, they can identify what they need to communicate better, and what they need to change in the apps, or elsewhere in the ecosystem.

Reference

Zarifis A. & Fu S. (2023) ‘Re-evaluating trust and privacy concern when purchasing a mobile app: Re-calibrating for the increasing role of Artificial Intelligence’, Digital, vol.3, no.4, pp.286-299. Available from (open access): https://doi.org/10.3390/digital3040018

#trust #information privacy #artificial intelligence #mobile commerce #mobile apps #big data

Dr Alex Zarifis

My new research developed a model of trust in making payments with the Ethereum token, Ether (Zarifis, 2023). I published the first peer-reviewed research on trust in payments with Bitcoin in 2014 (Zarifis et al. 2014), and I wanted to apply my experience from that to understanding the consumer's perspective on making Ethereum payments.

Ethereum is being utilised in various ways, including smart contracts and payments. Despite some similarities with Bitcoin, Ethereum is a different technology, with different governance and support.

Ethereum payments require digital wallets, and the process is different to paying in traditional fiat currencies like the Euro. When a person wants to take an action without controlling all the parameters, and some risk is unavoidable, trust is necessary.

Figure 1. Model of trust in making Ethereum payments, TRUSTEP

The model demystifies how trust is built in consumer payments with Ethereum. It starts with the individual's predisposition and then covers the factors from the specific context of Ethereum payments. From the person's individual characteristics, their willingness to innovate in finance and technology plays a role. There are then five variables from the context: adoption and reputation, stable value and low transaction fees, effective regulation, payment intermediaries, and trust in the seller. The personal and contextual factors together influence trust in the Ethereum payment process and making a payment with Ether.

While the model has similarities to previous models of trust, such as the role of each individual’s psychological predisposition and the role of reputation, the role of institutions such as regulators and the importance of trust in the retailer, the distinct characteristics of Ethereum also play a role. In fact, the factors related to the distinct characteristics of Ethereum have the strongest support based on the average of the responses. This research can be added to a growing body of research in trust that illustrates how users’ beliefs in each cryptocurrency need to be explored separately.

Furthermore, the role of the organizations involved in the payment process is shown. While trust in the retailer is usually a factor in retail payments, the regulators and payment intermediaries are not always a significant factor, so it is a useful contribution to show that they are here.

That is what I want to share with you here. If you have experiences related to what I am talking about, please let me know, I would love to hear from you.

Reference

Zarifis A. (2023) ‘A Model of Trust in Ethereum Token ‘Ether’ Payments, TRUSTEP’, Businesses, vol.3, no. 4: pp.534-547. Available from (open access): https://doi.org/10.3390/businesses3040033

Zarifis A., Efthymiou L., Cheng X. & Demetriou S. (2014) ‘Consumer trust in digital currency enabled transactions’, Lecture Notes in Business Information Processing-Springer, vol.183, pp.241-254. Available from: http://link.springer.com/chapter/10.1007/978-3-319-11460-6_21#

Dr Alex Zarifis

Collaborative Consumption (CC) and the sharing economy, where consumers do not purchase a product or service but share it, are growing in popularity. This is due to a trend away from ownership and towards experiencing. The first two areas of the economy that this business model disrupted were fare sharing and renting rooms for short periods. Other areas are also influenced, but it is unclear which sectors of the economy will be disrupted next. Smaller niches of the economy, or areas where more public-sector involvement is necessary, such as services for the elderly and the disabled, may not be at the forefront and may be the laggards, losing out on possible benefits for years.

This research evaluates the current CC business models and identifies 13 ways they add value from the consumer's perspective. It further explores whether CC business models fall into two categories in terms of what the consumer values. In the first category, they require a low level of trust, while in the second category a higher level of trust is necessary. Our survey evaluates whether there is a difference between CC business models that require a low level of trust, such as a taxi service, and those that require a high level of trust, such as supporting the elderly and disabled.

Figure 1. Comparative spider diagram of value added by collaborative consumption business models for low and high required trust

The analysis verified that the consumer requires 13 types of value added from the business model, which can be separated into three categories: personal interest, communal interest and trust building. It is important for organizations to acknowledge how they relate to these dimensions.

It was found that CC business models can be separated into those that require a relatively low level of trust, such as fare sharing, and those that require a high level of trust, such as supporting the elderly and disabled, as we can see in the figure here. For the business models that only require low trust, the consumer considered the personal-interest value added more important, while in those requiring more trust the consumer rated the value added of trust building higher.

The findings suggest that changing a CC business model from one that requires low trust to one that requires higher trust necessitates a significant improvement in how the organisation builds trust. This can be considered a 'step' change in trust building, which would have to be a consideration at the business model level. Iterative improvements at the operational level may not increase trust sufficiently.

Reference

Zarifis A., Cheng X. & Kroenung J. (2019) 'Collaborative consumption for low and high trust requiring business models: From fare sharing to supporting the elderly and disabled', International Journal of Electronic Business, vol.15, no.1, pp.1-20. Available from (open access): https://www.inderscienceonline.com/doi/abs/10.1504/IJEB.2019.099059

Dr Alex Zarifis


Have you made a purchase from a three-dimensional Virtual World (VW)? Probably not; only a small minority have. When VWs first became popular fifteen years ago, people jumped to the conclusion that they were the future, the new platform to socialise online. Their adoption, however, did not end up being exponential. So why do experts often think VWs, with their additional functionality, are the future, yet that future has not come? We decided to ask the consumer.
There is a degree of understanding of what each channel can offer, but the relative advantage of each channel in relation to the others is less understood. By relative advantage we mean something where one channel, for example three-dimensional VWs, has an advantage over another, such as two-dimensional, traditional, websites. This research evaluates the relative advantages between the channels of three-dimensional VWs, two-dimensional websites, and offline retail shops. The consumer's preferences across the three channels were distinguished across six relative advantages.

Figure 1. The three channels and six relative advantages in multichannel retail
In the figure, you can see at the top the six different relative advantages, and beneath them how the three different channels perform in relation to these relative advantages. Participants showed a preference for offline and 2D websites in most situations, apart from enjoyment, entertainment, sociable shopping, the ability to reinvent yourself, convenience and institutional trust, where the VWs were preferred.
We can look in more detail at the fifth relative advantage, that VWs have higher institutional trust compared to 2D websites. Consumers value the role of the VW as an institution in relation to trust. One feature that is appreciated is that the other party does not receive the consumer's banking details. Some participants value the role of the VW's administration in identifying and warning about specific threats.
The findings illustrated in the figure show that the consumer's preference varies across the three channels and six RAs. An organization pursuing a multichannel strategy can adapt its offerings in each channel to fully utilize these different preferences.
While on most issues VWs are the least appealing of the three channels, framing the comparison with the six relative advantages shows that they have a useful and complementary role to play in multichannel retail. For example, customer support can be provided in VWs. An organization can use these findings to shape its business model and strategy.

Reference
Zarifis A. (2019) ‘The six relative advantages in multichannel retail for three-dimensional Virtual Worlds and two-dimensional websites’, Proceedings of the 10th ACM Conference on Web Science, June 19–21, Boston, USA, pp.363-372. Available from: https://dl.acm.org/doi/pdf/10.1145/3292522.3326038

Dr Alex Zarifis

Several countries' economies have been disrupted by the sharing economy. However, each country and its consumers have different characteristics, including the language used. When the language is different, does it change the interaction? If we have a discussion in English and a similar discussion in German, will it have exactly the same meaning, or does language lead us down a different path? Is language a tool, or a companion holding our hand on our journey?

This research compares the text in the profile of those offering their properties in England in English, and in Germany in German, to explore if trust is built, and privacy concerns are reduced, in the same way.

Figure 1. How landlords build trust in the sharing economy

The landlords make an effort to build trust in themselves, and in the accuracy of the description they provide. The landlords build trust with six methods: (1) The first is the level of formality in the description; more formality conveys a level of professionalism. (2) The second is distance and proximity. Some landlords want to keep a distance so it is clear that this is a formal relationship, while others try to be more friendly and approachable. (3) The third is 'emotiveness' and humour, which can create a sense of shared values. (4) The fourth method of building trust is being assertive and passive-aggressive, which sends a message that the rules given in the description are expected to be followed. (5) The fifth method is conformity to the platform's language style and terminology, which suggests that the platform rules will be followed. (6) Lastly, the sixth method to build trust is setting boundaries that offer clarity and transparency.

Privacy concerns are not usually reduced directly by the landlord as this is left to the platform. The findings indicate that language has a limited influence and the platform norms and habits have the largest influence. We can say that the platform has choreographed this dance sufficiently between the participants so that different languages have a limited influence on the outcome.

Reference

Zarifis A., Ingham R. & Kroenung, J. (2019) ‘Exploring the language of the sharing economy: Building trust and reducing privacy concern on Airbnb in German and English’, Cogent Business & Management, vol.6, iss.1, pp.1-15. Available from (open access): https://doi.org/10.1080/23311975.2019.1666641

The interest in Non-Fungible Tokens (NFTs) has 'exploded' recently, but it is not clear what final form they will take. This innovation will have difficulties reaching a wider audience until more clarity is achieved on two main issues: what exactly the NFT business models are, and how they build trust. The findings of recent research (Zarifis and Cheng, 2022), illustrated in figure 1, show that there are four NFT business models:

(1) The first business model is an NFT creator: they can create digital art that is then minted as an NFT and sold on an NFT platform. The NFT's competitive advantages include having proof of irrefutable ownership, and the ability to sell a piece of art that is unique or limited to a low number. The reliability and transparency of the NFT build trust with the consumer.

Figure 1: The four NFT business models

(2) The second business model is an NFT marketplace, selling creators’ NFTs: The competitive advantage of NFTs as part of this business model is once again the irrefutable ownership, and that it gives consumers digital art they can own. The purchase history of the consumers is transparent, so this gives insights into their interests. As with the previous business model, a community and trust are built between the collectors.

(3) The third business model is a company offering its own NFT, typically a fan token: this business model has several NFT processes. These are to sell NFTs for profit, to give NFTs as rewards, to take payment in fan tokens, and to give an NFT so that the person receiving it has certain utilities and rights, such as voting rights. The competitive advantages of NFTs within this business model are that they allow fans to feel closer to their team, and build a community and trust between the fans.

(4) The fourth business model is a computer game with NFT sales: there can be in-game purchases of NFT-minted virtual items, limited or unique in-game purchases, and players can be rewarded for playing, known as 'play to earn'. This offers incentives to game developers to continue producing rare items, provides an ongoing revenue stream for existing games, and builds a community and trust between the players.

This research was the basis of Dr Alex Zarifis's keynote speech in front of around 300 people at JEBDE's 2nd Academic Conference on Electronic Business & Digital Economics on 28/09/2022.

Reference

Zarifis A. & Cheng X. (2022) ‘The business models of NFTs and Fan Tokens and how they build trust’, Journal of Electronic Business & Digital Economics, vol.1, pp.1-14. Available from: https://doi.org/10.1108/JEBDE-07-2022-0021

Dr Alex Zarifis

Ransomware attacks are not a new phenomenon, but their effectiveness has increased, causing far-reaching consequences that are not fully understood. The ability to disrupt core services, the global reach, the extended duration, and the repetition of these attacks have increased their ability to harm an organization.

One aspect that needs to be understood better is the effect on the consumer. The consumer in the current environment is exposed to new technologies that they are considering adopting, but they also have strong habits of using existing systems. Their habits have developed over time, with their trust increasing in the organization they are in direct contact with, and in the institutions supporting it. The consumer now shares a significant amount of personal information with the systems they have a habit of using. These repeated positive experiences create an inertia that is hard for the consumer to move out of. This research explores whether global, extended, and repeated ransomware attacks reduce the trust and inertia sufficiently to change long-held habits in using information systems. The model developed captures the cumulative effect of this form of attack and evaluates whether it is sufficiently harmful to overcome the e-loyalty and inertia built over time.

Figure 1. The steps of a typical ransomware attack

This research combines studies on inertia and resistance to switching systems with a more comprehensive set of variables that cover the current e-commerce status quo. Personal information disclosure is included along with inertia and trust as it is now integral to e-commerce functioning effectively.

As you can see in the figure, the model covers the seven factors that influence the consumer's decision to stop using an organization's system because of a ransomware attack. The factors are in two groups. The first group is the ransomware attack, which includes the (1) ransomware attack effect, (2) duration and (3) repetition. The second group is the e-commerce environment status quo, which includes (4) inertia, (5) institutional trust, (6) organizational trust and (7) information privacy.
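The two groups of factors can be summarised in a small structure. This is an illustrative sketch only; the key names are my own shorthand, not the paper's construct labels.

```python
# The seven factors of the model, split into the two groups described
# above (key names are my own shorthand, not the paper's constructs).
MODEL_FACTORS = {
    "ransomware_attack": [
        "ransomware attack effect",
        "duration",
        "repetition",
    ],
    "ecommerce_status_quo": [
        "inertia",
        "institutional trust",
        "organizational trust",
        "information privacy",
    ],
}

# Seven factors in total, split 3 + 4 across the two groups.
total_factors = sum(len(factors) for factors in MODEL_FACTORS.values())
```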

Figure 2. Research model: The impact of ransomware attacks on the consumer's intentions

The implications of this research are both theoretical and practical. The theoretical contribution is highlighting the importance of this issue to Information Systems and business theory. This is not just a computer science and cybersecurity issue. We also linked the ransomware literature to user inertia in the model.

There are three practical implications: firstly, by understanding the impact on the consumer better, we can develop a better strategy to reduce the effectiveness of ransomware attacks. Secondly, processes can be created to manage such disasters as they are happening, and to maintain a positive relationship with the consumer. Lastly, organizations can develop a buffer of goodwill and e-loyalty that would absorb the negative impact on the consumer from an attack, and stop them reaching the point where they decide to switch systems.

Dr Alex Zarifis presenting research on ransomware

References

Zarifis A., Cheng X., Jayawickrama U. & Corsi S. (2022) ‘Can Global, Extended and Repeated Ransomware Attacks Overcome the User’s Status Quo Bias and Cause a Switch of System?’, International Journal of Information Systems in the Service Sector (IJISSS), vol.14, iss.1, pp.1-16. Available from (open access): https://doi.org/10.4018/IJISSS.289219

Zarifis A. & Cheng X. (2018) ‘The Impact of Extended Global Ransomware Attacks on Trust: How the Attacker’s Competence and Institutional Trust Influence the Decision to Pay’, Proceedings of the Americas Conference on Information Systems (AMCIS), pp.2-11. Available from: https://aisel.aisnet.org/amcis2018/Security/Presentations/31/

By Alex Zarifis and the TrustUpdate.com team

The capabilities of Artificial Intelligence are increasing dramatically, and it is disrupting insurance and healthcare. In insurance, AI is used to detect fraudulent claims, and natural language processing is used by chatbots to interact with the consumer. In healthcare, it is used to make a diagnosis and plan what the treatment should be. The consumer is benefiting from customized health insurance offers and real-time adaptation of fees. Currently, the interface between the consumer purchasing health insurance and AI raises some barriers, such as insufficient trust and privacy concerns.

Consumers are not passive to the increasing role of AI. Many consumers have beliefs about what this technology should do. Furthermore, regulation is moving toward making it necessary for the use of AI to be explicitly revealed to the consumer (European Commission 2019). Therefore, the consumer is an important stakeholder, and their perspective should be understood and incorporated into future AI solutions in health insurance.

Dr Alex Zarifis discussing Artificial Intelligence at Loughborough University

Recent research at Loughborough University (Zarifis et al. 2020) identified two scenarios: one with limited AI that is not in the interface, and whose presence is not explicitly revealed to the consumer; and a second where there is an AI interface and AI evaluation, and this is explicitly revealed to the consumer. The findings show that trust is lower when AI is used in the interactions and is visible to the consumer. Privacy concerns were also higher when the AI was visible, but the difference was smaller. The implications for practice relate to how the reduced trust and increased privacy concern with visible AI are mitigated.

Mitigate the lower trust with explicit AI

The causes are reduced transparency and explainability. A statement at the start of the consumer journey about the role AI will play, and how it works, will increase transparency and reinforce trust. Secondly, the importance of trust increases as the perceived risk increases; therefore, the risks should be reduced. Thirdly, it should be illustrated that the increased use of AI does not reduce the inherent humanness. For example, it can be shown how humans train AI and how AI adopts human values.

Mitigate the higher privacy concerns with explicit AI

The consumer is concerned about how AI will utilize their financial, health and other personal information. Health insurance providers offer privacy assurances and privacy seals, but these do not explicitly refer to the role of AI. Assurances can be provided about how AI will use, share and securely store the information. These assurances can include some explanation of the role of AI and cover confidentiality, secrecy and anonymity. For example, while the consumer's information may be used to train machine learning, it can be made clear that it will be anonymized first. The consumer's perceived privacy risk can be mitigated by making the regulation that protects them clear.

References

European Commission (2019) 'Ethics Guidelines for Trustworthy AI'. Available from: https://ec.europa.eu/digital

Zarifis A., Kawalek P. & Azadegan A. (2020). ‘Evaluating if Trust and Personal Information Privacy Concerns are Barriers to Using Health Insurance that Explicitly Utilizes AI’, Journal of Internet Commerce, pp.1-19, Available from (open access): https://doi.org/10.1080/15332861.2020.1832817

This article was first published on TrustUpdate.com: https://www.trustupdate.com/news/are-trust-and-privacy-concerns-barriers-to-using-health-insurance-that-explicitly-utilizes-ai/

Author: Dr Alex Zarifis, FHEA

1) Why trust is important

Trust is necessary whenever there is risk. This means it is more important in some contexts than others. While trust has been researched for many decades, it became a more prominent concern with the introduction and expansion of the Internet. The loss of face-to-face interaction raised the perceived risk and the importance of trust. Once solutions were found to reduce the risk and build trust, this became a smaller challenge.

Insurtech is another phenomenon where concern about trust is increasingly important, so trust must be explored. Indeed, trust emerges as a problem whenever there is a new widely-adopted technology, like blockchain, 5G or AI. For example, chatbots or virtual assistants that utilize AI are widely used to interact with the person purchasing insurance or making a claim (Zarifis et al. 2020). From the consumer's perspective there are some concerns: it is unclear whether these assistants are trusted, and how many interactions with the consumer they can replace.

In this blog I outline the possible constituent factors to support trust in Insurtech. I start with the psychology and sociology of trust, then discuss trust in other areas and trust in AI and data technologies. I then draw these issues together to propose a model of trust in Insurtech.

2) The psychology and sociology of trust

There is literature on trust in many different areas, such as business, collaboration and education, but the foundations are usually psychology and sociology. Each specific context, such as business or, more narrowly, Insurtech, brings with it some idiosyncratic twists on the common themes from psychology and sociology.

Each person has a different physiology and set of experiences that shape their psychological disposition, so many models of trust start with this variable (McKnight et al. 2002). Having personally tried to explore and validate models of trust, I can confirm that it is usually hard to remove this variable and still have a model that is supported by the data. To put it simply, at one extreme some people’s default approach is to trust, while at the other extreme some people’s default is to mistrust; most of us are somewhere in the middle. Across various contexts, the psychology of trust is similar because it comes not from the context but from the individual. In other words, someone inclined to trust is inclined to trust across several contexts.

The sociological factors influencing trust are not as consistent as the psychological ones because they are shaped by the context to some degree. They are, however, often similar across similar contexts. These factors can come from the broader society, or from more specific subsets of society more closely related to the context. While we distinguish between the psychology and sociology of trust, it is important to clarify that the two shape each other over time, and how they interact depends on the specific instance of an interaction.

3) Trust in other areas

One prominent model of trust in e-commerce, widely considered the seminal paper bringing trust theory into e-commerce and information systems, showed how dispositions to trust combine with contextual factors to create trust (McKnight et al. 2002). Once trust was brought into e-commerce and information systems, the model was adapted to several contexts so that it captures the consumer’s perspective accurately. My more recent research has identified that in a multichannel retail environment including physical stores, 2D websites and 3D websites, trust can be built in one channel and transferred to another (Zarifis 2019). Trust in blockchain-based transactions like Bitcoin was found to combine the factors from e-commerce with some characteristics specific to this technology, such as the digital currency, the intermediary and the level of regulation and self-regulation (Zarifis et al. 2014).

The examples we have seen so far involve a payment, which puts a monetary value at risk. Trust is also necessary, however, in contexts where no monetary value is involved. In online collaboration, for example, trust evolves over several stages, and the interaction can be shaped with specific activities to reinforce it (Cheng et al. 2013). Education is another example where trust is important despite no monetary value being exchanged: in virtual and semi-virtual teams, non-homogeneous groups need more support so that they can build and sustain stable trust (Cheng et al. 2016).

4) Trust in AI and data technologies

Figure 1. The 3 levels of visibility of technologies from the consumer’s perspective

The introduction outlined why trust in Insurtech is important and how trust evolves. The consumer engaging with Insurtech, however, already has some experience of, and beliefs about, its constituent technologies. As we saw in the second section, the consumer’s trust evolves depending on the technologies they interact with. For example, while purchasing insurance online with a chatbot may be a new experience, the consumer may have interacted with chatbots before. Someone who uses a virtual assistant in their home, and experiences both the interaction and how their data is used, will have formed some beliefs on this issue. While AI dominates the headlines, other data technologies are also important, and each technology raises different issues. Blockchain technologies, for example, were designed to build trust, yet some people distrust them more than the existing alternatives: for some, blockchain and a decentralized ledger reduce risk, while for others a traditional database controlled by one organisation is less risky.

Therefore, we must understand the consumer’s perspective on the constituent technologies of Insurtech. Unfortunately, this is made harder by the different visibility of each of these technologies. Some are fully visible, like a chatbot; others are not visible, but consumers know they are there; and others are mostly unknown to the consumer. The three levels of visibility are illustrated in figure 1. The technologies that are visible to the consumer, and understood by them, can be seen as the ‘tip of the iceberg’ of what is actually used in the process of purchasing insurance or making a claim.
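The three levels of visibility can be sketched as a simple classification. This is only an illustration: apart from the chatbot, which the text names as fully visible, the example technologies and their placements below are assumptions, not part of the figure.

```python
from enum import Enum

class Visibility(Enum):
    """The three levels of visibility of Insurtech technologies to the consumer."""
    VISIBLE = "visible to and understood by the consumer"
    KNOWN_NOT_VISIBLE = "not visible, but the consumer knows it is there"
    UNKNOWN = "mostly unknown to the consumer"

# Hypothetical mapping; only the chatbot placement is stated in the text.
examples = {
    "chatbot": Visibility.VISIBLE,
    "blockchain ledger": Visibility.KNOWN_NOT_VISIBLE,
    "machine-learning pricing model": Visibility.UNKNOWN,
}

for tech, level in examples.items():
    print(f"{tech}: {level.value}")
```

Only the top layer of this mapping is the ‘tip of the iceberg’ the consumer actually sees.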

5) Trust in Insurtech

Figure 2. A model of trust in Insurtech

The role of technology in insurance is increasing, and this is reflected in the growing popularity of the term Insurtech. The term only emerged recently but is now widely used in the insurance and technology sectors. AI-driven automation, utilizing additional technologies such as big data, the Internet of Things (IoT), blockchain and 5G, is making the role of technology even more central than before. What is trust in Insurtech, and is it different to other forms of trust? The first step to answering this question is to identify its constituent parts. My starting point is that trust in Insurtech is formed by (1) the individual’s psychological disposition to trust, (2) sociological factors influencing trust, (3) trust in the insurer and (4) trust in the related technologies (e.g. AI). This relationship is illustrated in figure 2. Further research is needed to empirically test and validate this model. It must also be explored whether additional factors, like law and regulation, act as separate variables or moderate these relationships. The long journey of insurers, their consumers and AI has just started, and trust in each other is needed for it to be harmonious.
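The four proposed constituent parts can be sketched as a minimal data structure. The scores, the [0, 1] scale and the equal-weight combination rule below are placeholder assumptions for illustration; the model itself does not specify how the factors are measured or combined, and validating that is exactly what the further research mentioned above would need to do.

```python
from dataclasses import dataclass

@dataclass
class InsurtechTrustFactors:
    """The four proposed antecedents of trust in Insurtech (illustrative only)."""
    disposition_to_trust: float    # (1) individual psychological disposition
    sociological_influence: float  # (2) sociological factors influencing trust
    trust_in_insurer: float        # (3) trust in the insurer
    trust_in_technologies: float   # (4) trust in related technologies, e.g. AI

def overall_trust(f: InsurtechTrustFactors) -> float:
    """Equal-weight average: purely a placeholder combination rule."""
    return (f.disposition_to_trust + f.sociological_influence
            + f.trust_in_insurer + f.trust_in_technologies) / 4

# Hypothetical consumer with scores in [0, 1]
example = InsurtechTrustFactors(0.8, 0.6, 0.7, 0.5)
print(round(overall_trust(example), 2))  # 0.65
```

In practice the factors are unlikely to contribute equally, and, as noted above, law and regulation may act as separate variables or as moderators of these relationships.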

References

Cheng X, Fu S, Sun J, et al (2016) Investigating individual trust in semi-virtual collaboration of multicultural and unicultural teams. Comput Human Behav 62:267–276. doi: 10.1016/j.chb.2016.03.093

Cheng X, Macaulay L, Zarifis A (2013) Modeling individual trust development in computer mediated collaboration: A comparison of approaches. Comput Human Behav 29:1733–1741.

McKnight H, Choudhury V, Kacmar C (2002) Developing and Validating Trust Measures for e-Commerce: An Integrative Typology. Inf Syst Res 13:334–359.

Zarifis A (2019) The Six Relative Advantages in Multichannel Retail for Three-Dimensional Virtual Worlds and Two-Dimensional Websites. In: Proceedings of the 11th ACM Conference on Web Science, WebSci 2019. Boston, MA, pp 363–372

Zarifis A, Efthymiou L, Cheng X, Demetriou S (2014) Consumer trust in digital currency enabled transactions. Lect Notes Bus Inf Process 183:241–254. doi: 10.1007/978-3-319-11460-6_21

Zarifis A, Kawalek P, Azadegan A (2020) Evaluating If Trust and Personal Information Privacy Concerns Are Barriers to Using Health Insurance That Explicitly Utilizes AI. J Internet Commer. doi: 10.1080/15332861.2020.1832817

A version of this article was published on the Technology Driven Change and Next Generation Insurance Value Chains (TECHNGI) research project website: https://www.techngi.uk/2020/11/15/what-is-trust-in-insurtech-and-is-it-similar-to-trust-in-other-areas/

#trust #insurance