What ethical approaches are used by scientists when sharing health data? An interview study


The results describe the ethical approaches adopted by the participating scientists in relation to collecting, using, and sharing health data. In the analysis of this study, four main categories and fourteen subcategories were used to classify the discussions: 1) consideration of consequences (consequentialism), 2) respect for rights (deontological approach), 3) procedural compliance (procedural ethics), and 4) being professional (virtue ethics). An overview of the categories and subcategories is presented in Table 3. In the following, these categories and subcategories will be described and illustrated by quotes.

Table 3 The categories and subcategories of the ethical approaches attributed to the participating scientists in relation to sharing health data.

Category 1: consideration of consequences (consequentialism)

The consequentialist approach was expressed in three different ways: benefit to society, benefit to science, and do no harm to individuals/non-maleficence.

Benefit to society

The respondents argued that sharing individual health data benefits society in one way or another: data sharing may help explain the origins of diseases, benefit the population through the development of new treatments, and save lives. Some respondents expressed that the more collaboration there is between academic research projects, or even with commercial companies developing new medical devices or drugs, the more benefit there will be to society. A concern was raised that excessively strict legal rules do not benefit society, as they may hinder beneficial research and technical development. The respondents pushed for open data sharing in the research community as a means to improve health care in the future.

Another consequentialist argument given for the wide reuse of health data was the need to maximize benefits to society by maximizing data reuse as much as possible. It was argued that people’s tax money should be used in a manner that is beneficial for the population. Because hospitals, universities, and national consortia in Sweden are financed with tax money, the data they collect or generate need to be widely used to maximize their benefits.

[…] when data are produced with tax money, for example, then they [tax payers] want them to be used as much as possible. There should be no obstacles to that, so that each [taxpayer] gets as much benefit as possible from the data that have been produced. They must then be shared with more scientists and so on. (Respondent 1, data manager)

Benefit to science

The respondents stressed that sharing data is not only beneficial for society but also a necessity in relation to answering certain research questions. Data sharing and reuse increase the possibility of making new discoveries. The respondents expressed the view that if the applicable legal rules are too strict, the law will hinder important research.

I understand that there are reasons for laws and stuff like that, but […] it is often a hindrance I think, being constrained in what you are allowed to do in this way. (Respondent 12, scientist in applied mathematics)

Since data collection is expensive, it was expressed that the research community needs to preserve existing data. Reuse is perceived as a way to make good use of pre-existing data. Even within this subcategory, two respondents expressed that they saw no great risk of harm. Rather, their view was that the good consequences for improved research outweigh the risk of people being harmed. They could imagine potential negative consequences, but they perceived this risk to be highly unlikely. Some of the respondents expressed enthusiasm about having access to data and the freedom to perform research. These aspects encouraged them, and they expressed a strong motivation to perform data-intensive research tasks.

Another aspect of sharing data responsibly is that science needs a good reputation to maintain trustworthiness. If the scientific sector is perceived as trustworthy, it has a good basis for participant recruitment and retention. This aspect is also important in terms of maintaining people’s confidence in the research community for financial reasons, since most research funding comes from taxpayers.

Do no harm to individuals/non-maleficence

The majority of the respondents acknowledged that there are threats to participants’ private spheres in the form of bad consequences if their personal health information falls into the wrong hands. They recognized that data can be lost and end up in the wrong hands; people can be identified, and data can be misused. People could sell these data or earn money by blackmailing individuals, threatening to disclose that they belong to a risk group or have a certain disease. Insurance companies were mentioned as entities with an interest in this kind of information. There is therefore a risk that data ending up in the wrong hands could put people at an economic disadvantage. Some of the respondents viewed this as very hypothetical, but they acknowledged that it could happen; therefore, protective measures must be taken to protect participants.

However, two respondents perceived the risk of individuals being harmed to be so small that it hardly counts. One respondent could not see why anyone would be interested in the participants’ health data.

I do not really think that there are very many who are super interested in these data in that way. (Respondent 7, epidemiologist)

People [scientists] are generally very unnecessarily anxious. People sit behind desks and behind paper and are very anxious about… ‘what if data comes out, and what if I do wrong and I do not know exactly what is right or wrong?’ There seems to be a bit of chaos, and GDPR has not exactly made things clearer or eased nervousness. My very personal attitude is that people are a little too anxious for their own health about this. (Respondent 4, project coordinator)

Category 2: respect for rights (deontological approach)

In this category, the respondents described certain rights that need to be respected regardless of the consequences. The respondents explained that data subjects may be displeased and experience bad feelings if their information is in the hands of others. Within this category, we have included the respondents’ views on scientists’ right to perform research in the name of freedom of research.

Right to a private sphere

The respondents strongly emphasized people’s right to be protected and not to be identified without consent. This was viewed as a matter of respecting other people and their integrity. The respondents expressed that if people’s health data are spread and fall into the wrong hands, their personal integrity is violated. Another opinion supporting this view was that personal data should not be freely accessible to everyone and that people have the right to decide what is known about them. One respondent thought that people are not sensitive to whether their personal information is known by random people but that it makes a difference if a neighbour or colleague knows the same information. Hence, as it is difficult to determine beforehand how willing a person is to share information and how vulnerable he or she is, the respondents expressed that actions and security levels need to meet the needs of the most vulnerable people.

So, in general, I think the risks are quite low, but it is more to protect personal integrity, … some care very much and do not want to give out their social security numbers or do not want to have such information everywhere, and you have to respect that. Then, there are others who do not care. (Respondent 6, epidemiologist)

Autonomy

The respondents explained that people are entitled to make decisions about sharing their health information according to their own wishes; autonomy is the core of participation in research. Participants have the right to decide for themselves whether their health data should be shared and with whom. Careful attention should be given to the information stage of research, before data are collected, so that potential participants are aware of the purpose of the research, the scientific method used, the reuse of their data, and the expected results before consenting. It is essential for people to know whether their data will be used by others or for other purposes so that they are able to make an autonomous decision. Some of the respondents believed that responsibility for the reuse of data rested on potential participants’ agreement: if participants consent after being informed of the potential risks and benefits, they should be trusted to understand what they are agreeing to. People’s different preferences were also mentioned as a reason to leave the responsibility to decide to individuals. Moreover, giving them the opportunity to decide was seen as a way of respecting each participant as a person, the life that he or she has, and his or her experiences.

Who is responsible?… I would say the data subject. So, who is responsible for the data that is shared, yes, it is the individual, […] it is the individual who decides for him or herself. Do I want to accept this, that is, do I want to sign this consent? I want to buy a smart watch. I want a Google Home like this at home. Somewhere, consciously or unconsciously, individuals make a lot of choices in our society. So it must always start from the individual. […] Yes, and some people glide around in some form of ignorance as well. But I do not think that the government should be allowed to take responsibility for the fact that there happens to be ignorant individuals. (Respondent 4, project coordinator)

…and that the patient has agreed and is informed that it is shared and that it is understandable patient information, for example. The patient understands that it is voluntary and that they can say no as well. (Respondent 11, nephrologist)

Freedom

Moreover, the respondents voiced the importance of people being free to choose whether they want to participate in data collection, not only to exercise their right to autonomy but also to realize the value of living a free life. It was argued that the state should not regulate everything and protect people from every possible incident; this was perceived as an undesirable situation for society.

…it is so dynamic [the technical development], so I feel very sceptical about appointing the state as responsible; I think it is the wrong way to go. One must be able to protect oneself in the first place. I am a little reluctant to leave it to a government to decide what kind of data I may share or what I allow someone else to do with it. (Respondent 4, project coordinator)

Two respondents made the opposite argument, namely, that not all participants are autonomous. Because they are patients and dependent on care, there is a power imbalance, and they are not free to decide entirely on their own. They therefore need the protection of external rules that establish a more equal power balance, alongside processes and guidelines that make the collection and use of people’s health data safe and rigorous. Thus, some people are free thanks to the presence of rules.

Some other respondents focussed on freedom, but from the perspective of data users (e.g., researchers) rather than data subjects. The freedom to perform research was mentioned as an important norm of the research community. This norm is the basis of scientific research, and it needs to be protected; indeed, excessively strict and complicated rules hinder free research. Additionally, such rules were viewed as potential limitations on the level of knowledge that can be achieved, which goes against the goal of doing research.

Human dignity

Human dignity reflects the inherent worthiness of being a human. The respondents expressed the view that respect for human dignity may be less of a concern with the extensive data sets that scientists work with, since an individual becomes just one in a crowd. One reason for this phenomenon could be the social distance between a participant and the user of the corresponding data. The respondents emphasized that it is important to remember as a data collector that there are real people behind data; therefore, it is important to be careful when collecting, storing, and using data.

I think it is important that the scientists who use data and those who share data are aware that there are actually people attached to the information and that you should be very careful about how it is used and stored; there should not be any names and so on. (Respondent 6, epidemiologist)

Keeping promises

Keeping promises and following what has been agreed upon were mentioned in two ways. First, these concepts were described on the presupposition that an agreement with informed consent is in place and that nothing hinders the use of people’s health data: if people have agreed to something, there is no need to question whether it is harmful or wrong, and the rules are simply followed. Second, the respondents mentioned the importance of acting according to what has been promised. Indeed, what has been said should be followed:

[…] that is what people expect from us. (Respondent 8, geneticist)

Justice

Another argument expressed for maintaining high standards of control over participants’ health data was that some people are in more vulnerable situations than others. Therefore, justice based on needs was expressed as a requirement when asking for people’s health data.

… there should be very high demands [on how we treat data]. I think there are some people who are weak and extra vulnerable. (Respondent 7, epidemiologist)

In contrast, the idea of justice was also deemed a motivator of open data in research and care. The argument is as follows: through open access to data, it is possible to represent all people in scientific research. In addition, a concern was raised that existing health inequalities will be maintained due to a lack of data representativeness and that, in the long run, this will lead to an unequal distribution of care. Thus, greater openness in health data sharing will benefit everyone, particularly underrepresented groups.

And that is a driving force of why it is so damn important that open sharing and equal sharing of data are as broad as possible […] it is because this group was not represented in the data on which this algorithm was practised… (Respondent 10, medical scientist developing AI-based tools)

On the other hand, entirely open access to data was viewed as problematic from the perspective that professionals have put labour into data collection, and those who collect and analyse data should be recognized for their accomplishments. Some of the respondents noted the importance of scientists obtaining credit for the work they perform before their data are shared.

You have to get credit for what you have done, but you can still share it with others. (Respondent 3, neuroscientist)

Category 3: procedural compliance (procedural ethics)

In addition to selecting actions based on what they perceived to be the good and right thing to do, the respondents focused on the process. Many respondents voiced the need for good data collection and sharing procedures to ensure professional behaviour. They expressed a desire to do the right thing but wanted to reduce the obstacles that hold back beneficial development and make administrative work inefficient. They wished for a data-sharing routine delineating correct actions; it should be simple to follow the rules without having to constantly consider what is right or wrong. For example, there should be a practical computer system so that data files do not have to be e-mailed to others, and a simple, comprehensible ethical approval system. Rules implemented to facilitate good procedures were perceived as giving freedom to research and innovate.

[…]but I have gotten emails with social security numbers and addresses for people; this should not happen. I do not want that in my mailbox. (Respondent 6, epidemiologist)

As a new scientist, I think you become overwhelmed, and then I think that many start to cheat. Many do not apply [for ethical approval], but they run their race. (Respondent 8, geneticist)

However, one respondent emphasized that rules need to be flexible to adapt to changing circumstances (e.g., technological development and new research questions). These changes can be difficult to foresee; therefore, the need to be transparent was viewed as important in a changing world. Being transparent with the community and with respondents was also viewed as a good path for an ethically sustainable research environment.

So, this is a balancing act; they [the regulations and processes] must be alive… the ethical regulations must be alive so that they can adapt, because if there is new technology that makes it possible to do something we cannot do today, we must be able to say, you cannot do it this way; you have to do it this way or protect the individual in some way. (Respondent 8, geneticist)

Category 4: being professional (virtue ethics)

Finally, our analysis revealed expressions of being professional and how one should be as a scientist. This dimension was mostly related to the character that data collectors and users have or should have. One view was that scientists (data collectors) are not interested in individuals’ health statuses and do not dissect data at an individual level; therefore, the risk of something going wrong is negligible.

Scientists reported a need to be respectful and responsible, which are considered important virtues of their professions. However, some of the respondents believed that data users can be affected by their interest in discovery and forget or stretch the rules because they are curious.

[…] and then these people start to fumble a bit and hand data over to some company they collaborate with, which they think is very exciting, and so it starts to slide. (Respondent 4, project coordinator)

