Watch this. Understand this.

For quite a while now I have felt that we are at a crucial point in the history of the web.

In the weeks before this year's event, I thought back to my first visits to the Beyond Tellerrand Conference, how much it impressed me, and how I witnessed a real paradigm shift right at the start of the "responsive" thing.
In the following years, the talks and general trends grew more sophisticated, and myriads of possible solutions to new problems were presented, but there was no "big bang" moment (for me) like the one the RWD stuff delivered.
After "we" finally came around to accepting the never-ending diversity of devices and situations in which our work has to function, the realisation followed over the last years that our users and their (dis)abilities matter more than technological constraints or pure aesthetics.
Ultimately on the web, the discoverability of content always trumps visual presentation, and so a11y has to be baked into our processes right from the start - we know this. Well, at least in that part of my filter bubble that is dear to me.

But NOW is the time to step back and see how the underlying web has changed - where the data-mining industry, the "social" network giants, the connected devices everywhere, and the ever-growing influence of attention-hungry algorithms have brought us.

> An industry that has enabled genocide. Literally enabled genocide. We’ve got to be better than this. And it’s a pretty fucking low bar not enabling genocide. Acceptance criteria: did I enable genocide? Yes or no.

as Charlie Owen put it in her talk that opened the conference and, in part, anchored what this closing talk brought home.

If we live in an information society, what is the power of an algorithm that is trained to always feed you the kind of information that triggers the most engagement, filtering out anything that could challenge your point of view? An algorithm that has access to your interests, locations, health, wishes, habits, and taste by tracking, recording, listening to, and analyzing your actions online (and offline, through 'smart' devices)? That speaks directly to the part of your brain that thinks like a teenager who just discovered ~~masturbation~~ self-care?

Algorithms smarter than our ability to outthink them

What if all the signals you are receiving are pre-filtered to confirm what you already thought was right? Isn't that how totalitarian regimes throughout history have cemented their power?

I think this is the next 'RWD' moment: using social networks and silo platforms does hurt the web. It damages the information environment. The concentration of information in a few hubs controlled by corporations that got too big and too rich to fail by selling all the data they receive in exchange for a 'free' experience… and us feeding them through smart devices and their apps:

It hurts the web, and it hurts society.

Here we are with this amazing technology, transformed into a seemingly unstoppable hate-generating machine instead of a place where education and access to the collective knowledge of the world are possible.

Yes, the message of this rather sombre closing talk of this year's Beyond Tellerrand Conference Düsseldorf is important. Watch it. And then go out, take care of yourself and others, away from the screen. And then come back and publish your own stuff on your own site.
Still not convinced?
- Ok, then please read Matthias Ott's great article (published on his own site, btw), and *then* start using your own site.