Smartwatches or fitness trackers, for example: digitalisation on the body.

Healthy and democratic digitalisation is technology serving people. This sentence expresses our hope and curiosity about new developments, but also gives space to our concerns – at what point do people serve machines instead of the other way around? When it comes to tracking and the digitisation of the body, these questions quickly arise. But because smartwatches and trackers are popular, dealing with them is also a good way to start learning more about digitalisation.


Fundamental questions can sometimes be softened if you look closely at who actually tracks themselves nowadays, sometimes occasionally, sometimes regularly. Smartwatches, smartphones and modern platforms have made tracking, rating and comparing truly commonplace. At the same time, there is no shortage of technological naivety. The promise of democracy that providers and producers of smart technology make when they proclaim the democratisation of digital progress, because more and more people can participate in their complex services, is quite superficial. Shouldn’t it actually be the other way round: the industry may, if it is allowed to, participate in the data that belong to the buyers of its products?

So what are we to make of all this measuring, many people ask. That is actually a good sign, because a reflective use of technology begins when people carefully ask themselves questions about its risks and possibilities. After the phase of initial euphoria and promises, the hype, some have gained a more rounded picture from their own experience and from exchanges with others.

However, since the promise of Body Machine Interaction (BMI) also comes with high risks, we must all have an interest in people learning more about this technology and its application, overcoming prejudices, recognising critical aspects, eliminating dangers and threats. As with any technological innovation, we must not rely on the brochures and engineers. As users of such technology, we need to ask ourselves: to what extent does it help me, or to what extent does it create new dependencies and problems?

Ubiquitous computing: devices become social actors

Devices that are able to track and measure close to the body are an important building block for implementing the idea of ubiquitous computing. The vision behind this is to integrate devices into our everyday lives in such a way that they interact with us as social participants in a new form. Smartphones have shown how this idea can change our relationship with computers and how ubiquitous data transmission can make operating a computer convenient and intuitive.

In the case of tracking tools, fitness apps or cycle apps, it is obvious how they serve a personal need of their users. Although these things could have been digitised before the era of Big Data and mobile connectivity, the new way of processing data offers other advantages: We can share information with others, measure ourselves against other people, merge information from the past and the present and thus also gain insights into our future behaviour. And the most important thing: you don’t have to switch on a computer, run a programme, export and import a .csv file and so on. Everything is seemingly done automatically and quickly. Such tracking and measuring apps actually democratised the quantified self as well. What would Olympians have paid for such tools back then, and we get them for free today!
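The idea of merging past and present data to glimpse future behaviour can be sketched in a few lines. The following is a minimal illustration, not taken from any real app: a week of invented daily step counts is fitted with a least-squares trend line and extrapolated one day ahead.

```python
# Minimal sketch: extrapolate tomorrow's step count from a week of
# invented daily totals using an ordinary least-squares trend line.
steps = [6200, 7100, 5800, 8000, 7600, 9100, 8400]  # days 0..6

n = len(steps)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(steps) / n

# Slope and intercept of the least-squares fit y = intercept + slope * x.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, steps))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

forecast = intercept + slope * n  # day 7 = "tomorrow"
print(f"Trend: {slope:+.0f} steps/day, forecast for tomorrow: {forecast:.0f}")
```

A real tracking service does far more (seasonality, comparison groups, machine-learned models), but the principle of deriving a forward-looking statement from accumulated measurements is the same.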

A practice where “a person actively measures themselves with devices and apps to generate knowledge based on the results of analysis to help optimise their lifestyle and behaviour in the areas of fitness, wellness or health.”

Meidert et al., 2018, p. 42

A framework for “personal science” understood as a “practice of using empirical methods” to explore personal questions through data and digital devices.

So do tracking tools and apps show us that ubiquitous computing has won and is superior to all other visions of computing? Not quite, because there are reasons for a more critical view. If the claim is that technology serves people, the yardstick of consideration must be the extent to which BMI apps and tools enable people in the sense of empowerment:

Empowerment through technology

  • Improvement of skills
  • Solving problems
  • Contribution to self-determination, participation and autonomy
  • Giving people power of disposal and control over their data, devices and digital identity

A socio-technical view of BMI

The smart sock drawer

For small devices to become powerful and competent, they first rely on connectivity to larger data centres. To harness their ubiquity for analysis, a lot of data has to be processed: data delivered by smart speakers, watches, light bulbs and pacemakers, not on the device itself, but somewhere else. Whereas our grandparents kept notebooks in their sock drawer in which they wrote down their body temperature and number of push-ups every day, today’s sock drawer sits on various servers in Europe, Asia and North America. Be careful that nosy neighbours don’t take an uninvited look at what colour your socks are, or at whatever else you want to hide from them.

Control your own intimate data

Secondly, this data flow is not under the control of the people it concerns. We are confined to a role defined by the providers of the technology, as both data producers and users. In order to make this dual character visible, in the following we will not only speak of users and consumers, but also of producers.

“PRODUCER” and “USER” can often not be clearly separated in the digital world. With our bodies, we produce the data on which the services depend; we use the knowledge that is gained from them. The boundaries are blurring, which is why some propose the term “PRODUSER”. Only one question remains unanswered: to what extent are we also the “OWNER”?

Techno-optimists such as those in the quantified self scene do not criticise ubiquitous computing per se, but experiment with ways to free devices and data from their proprietary cages and give users control. Others want to regulate the presence and nature of tracking and measuring in daily life more tightly. They are concerned about privacy, especially when data is generated in the most private areas and spaces. They also demand more transparency about what is being done with the data, especially about what the service provider does with personal data without us produsers knowing about it: what data is processed, through which analytics model, and combined with what other data. Unfortunately, there is currently no device on the market that would simply allow you to limit the flow of data to your own network or device and not share it with others.

Sharing – with whom else?

Thirdly, it becomes relevant whether the insights gained from produser data are also shared with third parties beyond the produser and the provider. Last but not least, systems that use Big Data rely on a constant flow of large amounts of data. As a result, fitness apps and tracking tools try to get users to use them as much as possible. Good analytics models and predictions require limiting the generation of false data (e.g. cheating) or of no data at all (turning off tracking features or not using devices). One way to achieve stronger engagement is gamification, the transfer of game mechanics and game-like elements into a different context (e.g. colourful, moving, interactive elements); another is nudging (e.g. prompts and motivational, seemingly personal messages).
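How such engagement mechanics might look in code can be illustrated with a deliberately simple sketch; the goal, streak logic and message texts are invented for illustration and not taken from any real product.

```python
# Toy sketch of two engagement mechanics: gamification (goals, streaks)
# and nudging (motivational, personal-sounding prompts).
def daily_nudge(steps_today: int, goal: int, streak_days: int) -> str:
    if steps_today >= goal:
        # Gamification: reward the streak to encourage daily use.
        return (f"Goal reached! That's {streak_days + 1} days in a row - "
                "keep your streak alive tomorrow!")
    missing = goal - steps_today
    if missing <= goal * 0.2:
        # Nudging: "almost there" messages exploit loss aversion near the goal.
        return f"Only {missing} steps to go - a short walk will do it!"
    # Social comparison as a fallback prompt.
    return "Your friends are already moving. Don't fall behind!"

print(daily_nudge(10500, 10000, 4))
print(daily_nudge(9200, 10000, 4))
```

Even this toy version shows the pattern: based on the incoming data stream, the app selects the psychologically tuned message most likely to keep the produser producing.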

New addictions

It is plausible that gamification can tempt sensitive or addiction-prone individuals into unhealthy behaviour. More fundamental critiques fear the implications for society: being tracked and monitored may become an unquestioned norm and, if it eventually becomes part of our habitus, may increasingly diminish the possibility of resisting this technology. Just as the public ‘rating’ of others has become the norm through the rating systems ‘offered’ or imposed by digital platforms. Or just as children who grow up with smart dolls or smartphones that monitor them on behalf of their parents become accustomed to digital surveillance from an early age.

Looking more generally at our digital identities today, those digital representations of personality that we continuously create, one has to consider that especially the devices close to the body generate a vast amount of personal data. Curating one’s own identity on the internet thus becomes all the more difficult.


Safety must be something we can take for granted. In particular, people who have to rely on Body Machine Interaction (BMI), e.g. pacemakers, cochlear implants or brain stimulators, but also on non-invasive forms such as smartwatches and fitness trackers, must be able to trust completely in the integrity of the devices and in scientifically valid analysis models.

Researchers such as Ienca & Haselager (2016) have developed various scenarios of how perception and the brain could be hacked. Their findings can, however, also be applied to other types of connection between humans and machines. Devices can of course be hacked directly. But more threatening, because easier, is indirect manipulation: attacking and tampering with the connection between device and analysis. Equally dangerous and even less controllable are common software problems caused by faulty or insufficiently tested updates.

Everyday problems are already raised by machine ethics, but far from answered: Who is responsible for actions triggered by human-machine interaction? What control mechanisms need to be built in so that a human can control an automatic action performed by a system? (Clausen et al., 2017) It would be too short-sighted to think only of brain implants or futuristic scenarios here. Even autonomous cars, service robots or ordinary smart prostheses are moving on ambiguous terrain.

Self-learning and digital education

Do the negative aspects of these new BMI devices ultimately outweigh their advantages? To answer that, one must first learn about and then weigh the pros and cons. The companies that make their money from BMI have so far not shown that they feel responsible for this. So produsers have to become active themselves and be better supported by digital education, consumer protection organisations, digital civil society and the media.

Take a realistic look at one’s own tracking preferences

A study from Switzerland (Meidert et al., 2018) took a closer look at how different people use these tools and apps. The authors conclude, as already described in our brochure on the Digital Self (Zimmermann, 2021, p. 38), that the Quantified Self strengthens people’s body awareness (42%) and helps them to observe their body (27%), and that it thus opens up an opportunity for a better life (30%). On the other hand, many have privacy concerns (31%), rate data-driven self-observation negatively because it undermines the natural competence to observe oneself (21%) and/or criticise inaccurate measurement and consider it a gimmick (24%).

Less healthy people, on the other hand, are more reserved. They generally measure when they have to, be it for prevention, to prepare for a consultation or because they are ill. Among unhealthy people in particular, concerns are growing. From their interviews, the researchers conclude “that there is a general fear of being discriminated against or disadvantaged in the future because of lifestyle choices” (Meidert et al., 2018, p. 84).

In general, people measure: steps (63%), weight (26%), pulse (26%), calories (26%), menstruation (23%), sleep (21%), stairs (15%), other parameters (15%) and blood pressure (9%) (ibid., p. 81). They mainly use their smartphones with an app (62%), traditional devices (26%), activity trackers (25%) and smartwatches (17%) (ibid., p. 84).

It seems empirically proven that there is an upwardly mobile and a performance-oriented milieu of frequent trackers for whom the body plays an important role as symbolic capital. But this does not represent the majority of respondents by far. Moreover, competitive reasons certainly do not play a role in the use of cycle apps, for example. So instead of being put off, it is worth taking a closer look.

  • Think about which platforms, apps and devices you use for tracking and which ones collect measurement data through you.
  • How do you deal with it? Are you curious and try out all the features? Do you use your device continuously?
  • Do you think about risks while using it?
  • Do you generally enjoy working with data?
  • Do you try to cheat your device from time to time?

Which of the types identified by the Swiss researchers are particularly close to you?

1. Anna, the sporty one: Tracks while running to measure distance, time and pulse as well as her progress. She wants to train optimally and prepare for her training goal. Otherwise she doesn’t track at all.
2. Toby, the tech-savvy one: Is curious about the technology, what it offers and how it can be used. Enjoys tracking. Has many devices and gadgets that he uses and tests.
3. Tamy, the diabetic: Tracks because she has a chronic illness. She measures her blood sugar levels very often during the day to stay healthy.
4. Juan, the critical non-user: Does not track because he does not want his data to be available to others. Doesn’t want to know too much about himself either, but rather lives according to his feelings.
5. Klaudia, the frequent tracker: To reduce her weight, she systematically measures many parameters and derives knowledge from them. She wants to become aware of things through numbers.
6. Steven, the step counter: Tracks steps with a pedometer to stay fit, especially at his age. He sets himself daily step goals.

More on the profiles and source: Meidert et al., 2018, p. 92 ff.

Empowerment through technology?

Do the data and analysis help you to … (rate each item on a scale from +5 via +3, 0 and −3 to −5, then add up the sum)

  • Improve skills?
  • Solve problems?
  • Improve self-determination, participation and independence?
  • Increase power of disposal and control over your data and devices?

App and device check

In tests and reviews of tracking apps, devices and social platforms, these aspects are unfortunately hardly mentioned at all. One step would be to push for the above aspects to be included and to give the conditions for sharing and data analysis more weight in purchasing decisions.

In addition, we as a society need to discuss which forms of tracking or data sharing should be banned or can be replaced by privacy-friendly alternatives: for example, analysis that works without internet access; data that is openly available to me so that I can also read it with other programmes; and the ability to decide to whom my data is available, in what form, for how long and for what purpose.
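What “analysable without internet access” and “openly available data” could mean in practice: the sketch below reads a hypothetical CSV export of step counts using only Python’s standard library, entirely locally. The file layout and column names are assumptions, since every manufacturer exports differently.

```python
import csv
import io
from statistics import mean

# Stand-in for an exported file such as "steps.csv"; with a real export
# you would use open("steps.csv", newline="") instead of io.StringIO.
EXPORT = """date,steps
2023-05-01,7200
2023-05-02,8100
2023-05-03,5600
2023-05-04,9900
"""

# Parse the rows and compute simple statistics, all on the local machine.
rows = list(csv.DictReader(io.StringIO(EXPORT)))
counts = [int(r["steps"]) for r in rows]
print(f"{len(counts)} days, average {mean(counts):.0f} steps,"
      f" best day {max(counts)}")
```

If devices and apps offered such open exports by default, produsers could run any analysis they like without the data ever leaving their own machine.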

  • None of the smartwatches tested by the German Stiftung Warentest in 2019 was convincing with regard to data protection. However, data protection was weighted at just 10% in the ratings.
  • For Android users, the open-source app Gadgetbridge allows some fitness trackers and smartwatch models to be used without an account with the manufacturers and without an internet connection.

For what added social value am I willing to share personal data?

The COVID-19 apps all over Europe have demonstrated the benefits of body-based technology for the public. There are also other scenarios where sharing personal data can be in one’s own interest and in the public interest. For example, in pandemics, in disaster situations, or to enable health researchers to better align their research with societal needs or to put their theories to the empirical test. What would you disclose and in what form?

Concerns second?

One example of how not to do things is certainly the electronic patient file in Germany, the federal project for a digital health sector. Although it is a mammoth project, shortly before its introduction at the beginning of 2022 it still did not allow patients without suitable devices to determine at document level which health personnel may view which data. In the worst case, the orthopaedist can see psychotherapeutic findings from 20 years ago. Nor is it logged who has accessed which document (BfDI). It would be good if we did not have to rely on such grotesque misplanning being uncovered little by little, and almost by accident, by voluntary activists such as those from the Chaos Computer Club or by data protection commissioners.

A fortiori, in privately organised projects, we cannot hope that privacy, confidentiality, security or control enjoy the highest priority. Moreover, further development tends to be driven by economic rather than medical interests (Bertelsmann Stiftung, 2016).

Nowadays, privacy is to be understood much more broadly than just the protection of data against manipulation or access by unauthorised persons. In order to protect their privacy, users must also be able to control data and withdraw it from the market or from others. Others, however, as described above, believe that privacy is protected as long as the treasure trove of health data is effectively anonymised and can thus be made available for research.

What may be good for me may be harmful to others. Imagine insurance companies could make their wishes come true and their rates were to be tied to tracking. Why? Because risks can be measured much more precisely. Of course, healthy and low-risk people will benefit, but vulnerable groups will pay massively for it and be forced to justify their lifestyles to algorithms. In this sense, we are not only making decisions for ourselves, but also for others.

Let’s talk about digitalisation on the body

Although many feel unease when they think of tracking, measuring, algorithms in connection with body data or in connection with implants and prostheses, I hope that this post was able to arouse some curiosity.

It is precisely the vast amounts of personal data and the growing number of sensors around each of us against which improved (European) regulation will have to prove itself. This will show whether the European path to digitalisation differs significantly from the North American or Chinese one.

At the same time, many people associate existential hope with the possibilities of this form of digitalisation, which can help them to expand their skills in a self-determined way. This is definitely too interesting a topic to be left to the engineers alone!

Last but not least, it should be noted that although many people are uncomfortable thinking about the topic, even more are interested in it or already practice the quantified self in small ways. Thus, their experience is a competence that (political) education and digital literacy have not really taken into account so far. Perhaps this is changing?


Bertelsmann Stiftung (2016). Health Apps, Spotlight Healthcare. Gütersloh.

Bundesbeauftragter für Datenschutz und Informationsfreiheit (BfDI): Die elektronische Patientenakte (ePA)

Clausen, J.; Fetz, E.; Donoghue, J.; Ushiba, J.; Chandler, J.; Spörhase, U.; Birbaumer, N.; Soekadar, S. (2017). Help, hope, and hype: Ethical dimensions of neuroprosthetics. Science, 30 June 2017, p. 1338 f.

Ienca, M.; Haselager, P. (2016). Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity. Ethics and Information Technology, 18, 117.

Meidert, U.; Scheermesser, M.; Prieur, Y.; Hegyi, S.; Stockinger, K.; Eyyi, G.; Evers-Wölk, M.; Jacobs, M.; Oertel, B.; Becker, H. (2018). Quantified Self – Schnittstelle zwischen Lifestyle und Medizin. TA-SWISS 67, Zürich.