The European Commissioner Thierry Breton recently announced on the set of Daily that the European digital services law will force tech giants to open the hood of their algorithms so that a committee of experts appointed by the Commission can analyze them. To do this, he says he intends to call on up to one hundred and fifty scientists and engineers. At first glance, one might prefer that the money Europe spends on these audits – free of charge for the giants concerned – be invested in research instead. But that would be to mistake the Commissioner's clumsy announcement effect for the substance: the recently adopted text is far more subtle and intelligent than a simple unconditional, systematic audit. Watch out for the media buzz!
Everyone will agree on the importance of building relevant regulation that protects the fundamental rights of Europeans while encouraging innovation. But the Commissioner's comments suggest that the algorithms of every technical actor will be evaluated – free of charge for their owners, but of course financed by our taxes.
In fact, the law that has just been passed, the DSA (Digital Services Act), requires the GAFAMs – explicitly covered by the text's conditions of application – to make the algorithms at the origin of an incident available, in the event of such an incident and at the request of the legislator. There has indeed been no shortage of algorithmic scandals in recent years, in stark contrast to the near-total absence of remedies from their owners. Between the lines, the text provides for the establishment of governance over these algorithms. In the event of an incident followed by an algorithmic audit, the penalties incurred are, rightly, at least three times higher than those provided for by the General Data Protection Regulation (GDPR).
Getting out of the dark
Now let's analyze what "looking under the hood" – an algorithmic audit – can actually mean. At the risk of upsetting those who advocate publishing the source code in which the algorithm is programmed, that step should be the last brick of the transparency process. Transparency must above all cover the practices surrounding the algorithm's development, testing and use, because the source code explains neither the designers' arbitrary choices about the algorithm and its dataset, nor the way both are tested and validated. It therefore does nothing to address the causes of the repeated scandals. In addition, future texts should require actors to apply statistical methods – whatever they may be – to extract the logic of the algorithm; this is also known as computing explainability. We would then finally escape a blindness that the owners of these tools seem only too happy to maintain.
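To make the idea concrete, here is a minimal sketch of one such statistical method: permutation importance, which probes a model purely through its inputs and outputs, without any access to its source code. Everything here is illustrative – the black_box function stands in for an opaque platform algorithm, and its hidden weights are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    # Stand-in for an opaque algorithm: in a real audit, only
    # its inputs and outputs would be observable, not this code.
    return 2.0 * X[:, 0] + 0.1 * X[:, 1]

# Observed behaviour of the system on sampled inputs.
X = rng.normal(size=(1000, 3))
y = black_box(X)

def permutation_importance(model, X, y, n_repeats=10):
    """Estimate each input's influence by shuffling it and
    measuring how much the model's prediction error grows."""
    baseline = np.mean((model(X) - y) ** 2)
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        deltas = []
        for _ in range(n_repeats):
            X_perm = X.copy()
            X_perm[:, j] = rng.permutation(X_perm[:, j])
            deltas.append(np.mean((model(X_perm) - y) ** 2) - baseline)
        importances[j] = np.mean(deltas)
    return importances

# The third input barely matters to the hidden logic, and the
# audit recovers that fact without reading a line of source code.
print(permutation_importance(black_box, X, y))
```

Auditing a real recommendation system is of course far harder than this toy example; the point is simply that an algorithm's logic can be interrogated statistically, from the outside.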
Words have meaning. Transparency, explainability, algorithm and even source code are terms to be handled with precision if Europe's visionary objective is to be achieved. And it is through this mastery of technical and scientific language that legislators and parliamentarians will set an example. Because it would indeed be a shame to escape the bubble effect only to be locked into a buzz effect.