Blog – Joanita Nagaba, Data Protection Specialist

What is the fate of countries like Uganda in the Coming Wave? (21 February 2024)

In his book, The Coming Wave, Mustafa Suleyman warns that the nation-state will be subject to massive centrifugal and centripetal forces, centralization and fragmentation. He further warns that this recipe for turbulence will create epic new concentrations and dispersals of power, splintering the state from above and below and ultimately [casting] doubt on the viability of some nations altogether. To some, this may sound like another form of fearmongering. Just another successful Tech bro blowing the horn while fueling the same system – some might sneer.

I cannot comment about the sneers, but I am astounded by Mustafa's predictions as he unveils the sheer magnitude of the coming wave driven by Artificial Intelligence – adding to the wealth of insights from previous authors including Kate Crawford, Eliezer Yudkowsky, Max Tegmark and many others. Most authors' predictions swing between doom and glory, with a shared thread of an Orwellian future, successfully evoking debate among scholars, lawyers, engineers, students, civil society organizations and more. What do we need to do? How can we help? Do we have the skills to rewrite the doom? Can we contain this problem? Do we have the power to address the problem? Do we understand the problem? And do our governments have the power to address the problem?

Some of the underlying concerns were quite evident at the most recent US Senate Judiciary Committee hearing on online child sexual exploitation. The symbolic nature of the entire room was unsettling. The Senate Judiciary Committee Members, composed mostly of senior citizens, sat opposite five Big Tech CEOs while a group of angry parents held placards with photos of their deceased children – whose deaths were attributed to these companies.

When the room was called to order, the parents sat still with their backs straight up and their eyes firmly held onto the Senate Judiciary Committee – their last hope of justice. “Finally, let’s shame them! The world is finally watching them! The world must know what they have done!” must have been the words on many of their minds.

As the Committee Members berated the Big Tech CEOs, the room echoed with chatter, clapping and standing ovations. “…You have blood on your hands. Your product is killing people,” said Senator Lindsey Graham in an unwavering and authoritative voice, like a father berating his child. While Mark Zuckerberg looked embarrassed and perhaps dismayed at the Senator's tone, he projected firmness. As more Committee Members asked more questions, the CEOs' responses swung between arrogance and legalese. “I cannot comment on that.” “I don't agree with that characterization,” and so on. Even though the CEOs were put to shame before the world, they didn't seem perturbed. This was yet another hearing. A practice session for the next hearing. Shielded by the veil of Section 230 of the Communications Decency Act, the CEOs would go back to their opulent offices and make more money. The parents would go on to mourn their children and the Senators would continue doing what they do best. Debate.

While the Senators were hoisted in their semi-circular row of desks overlooking the CEOs, officiously berating their conduct and criticizing their algorithms, the combined annual revenues of those companies were enough to wield power like never before. Had the CEOs made a mockery of the historic significance of the hoisted semi-circular desks? Because the real historical significance of that hearing was the alteration of power dynamics, the concentration of power among the Tech bros and the fragility of governance systems. When the hearing ended, I sighed knowing that it would not be the last hearing. Those parents would not be the last to bury their children. We were only witnessing the centrifugal and centripetal forces in motion.

If Big Tech CEOs could challenge a superpower, what was the fate of countries like Uganda? Do these companies care about the impact of their algorithms on minors in Uganda? When Shou Zi Chew said that TikTok is investing two billion dollars in trust and safety efforts, did that account for users in Uganda? Do these companies monitor the performance of their algorithms in countries like Uganda? Are they accountable to governments like Uganda's? And what is the recourse of the victims in Uganda?

See, when we speak about concentration of power, we must equally reinforce the fragility of human beings whose voices are slowly dimming. Do the Big Tech CEOs care for some countries when their combined annual revenue far exceeds those countries' GDP? Do their trust and safety efforts account for illiterate parents? Do these efforts account for the financial strain on parents who do not have time to police each of their children's activities online? How many children in Uganda are victims of Child Sexual Abuse Material? Do they have these statistics? Do they have any intervention measures? Or is their intervention determined by the share of revenue?

It is undoubtedly true that technology in the twenty-first century is shifting tectonic plates, altering power dynamics and disrupting systems all around us. Yet the scale, capability and influence of these companies are only budding. While the predicted shifts in power dynamics in Western countries are gradually being unveiled, the fragility of governance systems in countries like Uganda still remains firmly veiled. Worse still, the fate of victims remains in the dark, and questions about correlation and causation are either left unanswered or not asked at all. But perhaps the more important questions are: What is the fate of countries like Uganda in the Coming Wave? What is the fate of victims in countries like Uganda? What reinforcements do governments need to hold Big Tech accountable in all their countries of operation? These questions are not exhaustive, but only the start of a discussion, because online child protection is not limited to some and online safety must be available to all.

Reference

Suleyman, M., & Bhaskar, M. (2023). The Coming Wave: Technology, Power and the Twenty-First Century's Greatest Dilemma. New York: Crown.

The intricate nature of the Data Protection issue on the African Continent (10 August 2023)

I have spent the past five years in the Data Protection and Privacy field. For the most part, I have been keen on new developments in the European Union and the Council of Europe. With time, I took interest in the United States of America and went further to take an online Data Privacy and Technology course with the Harvard Business School. I recall being one of the very few people from the African continent attending this course. The course challenged me enough that I was soon prompted to read as many books as I could in order to arrive at my own conclusion on the issue.

While I appreciated the literature, I soon realized that I had spent substantial time harnessing a foreign approach to Data Protection. I was attempting to transplant the English Oak to the African Continent and expecting it to flourish. I was completely detached. How could my skills add value to the continent if I was this detached from it? As soon as I came to this realization, I decided to take an interest in Data Protection on the African continent. It became more apparent that in most parts of Africa, hunger, conflict, poverty, disease, and lack of access to electricity and the internet were the prevailing issues. Data Protection was not a critical issue for people struggling to find their next meal.

More recently, Worldcoin stirred up our neighbours with a proposition to access their biometric data (iris scans) in exchange for less than $50. From a different standpoint, this may sound ludicrous. But according to the State of Food Security and Nutrition in the World Report of 2023, 14,800,000 Kenyans were severely food insecure while 38,300,000 were moderately or severely food insecure between 2020 and 2022. What incentive would they have to choose Data Protection over feeding their families? Data Protection seemed like a Global North problem.

I knew that I had to find a different approach to the problem. It was important to find the right balance between the prevailing and emerging problems. I reached out to my friend Tommy – a software developer – with a proposition on how we could, at least, solve the internet and electricity problem. Our research led us to the conclusion that we needed significant funding, which we didn't have. So, we put a pin in that. I continued to be haunted by my failure to address some of these problems; to add my brick to everyone's efforts. I started to tinker with the idea of solving the food insecurity problem.

According to my research, out of the estimated 47,300,000 Ugandans as of 2022, 34,000,000 were moderately or severely food insecure between 2020 and 2022. This meant that only 13,300,000 could comfortably afford food. Inevitably, such a population would not be concerned about Data Protection. Why would retailers worry about the data practices of tech companies that are providing cheaper avenues to access markets? I came to the rude awakening that my efforts over the past five years had been awfully out of touch with the reality in my country. There had to be an alternative approach.

I returned to Tommy with yet another proposal, this time to alleviate food insecurity. During our engagement, I proposed a solution that relies on the ubuntu concept still present in the African people's psyche – an online food bank. After a long call, he was optimistic that this was achievable. We needed to appeal to Ugandans' empathy to share their excess food, because no person should go to bed hungry through no fault of their own. In any event, climate activists were already concerned about food waste. This would hit many birds with one stone. I crossed my fingers and hoped it would work. As I write this article, we are in the final stages of building a Minimum Viable Product (MVP). I am confident that while this solution might not significantly move the needle, it will make a difference, however slight.

As I conclude, I am alive to the insignificance of my efforts. Therefore, I would like to appeal to African countries to reinforce their efforts towards alleviating suffering on the continent. As long as our legs are still stuck in the muddy waters, we may find difficulty in approaching emerging issues. Secondly, I would like to appeal to governments to provide sufficient resources to Data Protection Authorities. While their mandate may not seem like an urgent need, it is equally important. We must find the right balance. Thirdly, as we build towards firmer efforts at the continental level, I would like to appeal to the regional blocs to provide an extra layer of protection to their member states.

Finally, I would also like to appeal to the African Union to strengthen its efforts towards protecting the personal data of every national of a country within the Union. This may require rethinking the current approach of the Malabo Convention and perhaps benchmarking against the effect of the GDPR. If we wait for all countries to enact their laws and put in place robust Data Protection measures, it will take years. Time that we do not have, especially as the floodgates of Artificial Intelligence open up. Without a deliberately contextualized approach, Africans remain vulnerable to tech companies that have turned this continent into a playground.

Can we restrain the urge to take out our phones? (13 July 2023)

Today as I lazily scrolled through a WhatsApp group discussion, one post stood out. A post that quoted the late Aponye's wife saying, “While people were taking pictures of my dead husband at the accident scene and sharing them, a young girl – Barbra – quickly got a katenge/kitenge and covered him. I will be forever indebted to her for that single act…” For a moment I pondered how we have engendered a culture of sharing all manner of information without any regard. Have the algorithms hacked the human psyche? Have the algorithms hacked our sense of empathy? Have our brains been hacked?

As I contemplated these questions, I was reminded of a funeral service I attended. Having received news of my friend's father's death, I wanted to condole and offer my comfort – insignificant as it may have been at the time. When I arrived at the church, I took my place at the back, leaving all available front seats to the grief-stricken family. We stood, sat and bowed our heads at the direction of the church leader. Eventually she announced the speech session, hastening to add that members should keep it short. One by one, family and friends stood up reminiscing about a remarkable man. A man who loved fervently, joked frequently and embraced his people. I was in awe. As the children stood in line to speak about their late father, I lifted my head to locate my friend. His eyes red, every now and then forcing a smile whenever his brother recounted another one of his late father's jokes. I saw his pain, I felt his pain, because seven years prior I stood at the same pulpit looking over at the casket that held my childhood friend. I knew the vulnerability because I had experienced it. The loss of a loved one. While I shared his grief, I wondered whether I had intruded upon this intimate moment. A moment where family and friends grieve and comfort one another. A moment where we are all at our most vulnerable. An event that tends to either strengthen or tear the tapestry that holds a family. This was a private event and I was a spectator – an intruder. I had not earned the right to sit at the back of that church and listen in as the family recounted their tales with nostalgia. But I could not leave. And so I stayed until the chief mourner led us in a prayer that reminded us of the evanescence of life. As I walked out, I knew I had to see my friend, make haste and allow the family to grieve privately. And so I did.

Many of us have shared this experience. I wonder if it has occurred to us that social media is inviting us into places where we should not be? That social media has enabled us to overshare intimate and private information? Like Barbra, can we restrain the urge to take out our phones? Can we summon the remnants of the empathy that still lies in our psyche? Can we locate the nearest pall and veil such moments, warding off spectators? Can we resist the urge while we still can? Because every moment we take out these surveillance devices during such intimate moments, we are not the only spectators. We are inviting an avalanche of spectators that surreptitiously follow us everywhere with voyeuristic pleasure. To them we are no longer humans, we are subjects. Data Subjects. They have offered us their “genius” platforms for free and in return they demand our valuable personal data. The oil of the twenty-first century – they say. But at what cost? At whose cost?

What have we done? (19 June 2023)

After a long day, I like to listen to white noise because of its calming effect, and often it whisks me away into slumber. More recently, Spotify has been a good companion for my mental nourishment. However, this process is often disrupted by incessant and loud adverts, defeating the purpose of whisking one to sleep. Why would Spotify present me with the idea of better sleep while trying to keep me awake? It is counterproductive. As I write this article, it is half past midnight and sleep eludes me.

Spotify collects my personal data, aggregates it and profiles me. To this extent, I am not aware of who else has an interest in my personal profile or who has accessed it. Why would Spotify keep me awake in an effort to sell its premium subscription? This is grossly unfair, especially because of the opaqueness of our relationship and the imbalance of power between us. Was Spotify creating a class system where serenity came at a price? Have we designed a system where peace and tranquility are only available to the rich and famous?

And who thought this advertising method was a brilliant idea? Try and visualize this in our physical world. It sounds like someone is always lurking in the nearby bush waiting to advertise. Even in your sleep. This is a menace. We must design a digital environment that not only mirrors our physical world but is positively amplified by the power of Artificial Intelligence.

With all the problems that plague our planet, we ought to rely on advanced technology to solve them – not to keep us up all night. As I listened to the paternal and avuncular voice of David Attenborough narrating Our Planet, I was quite surprised at the near similarity between the climate sustainability cause and Artificial Intelligence ethics. He said, “All across our planet, crucial connections are being disrupted. The stability that we and all life relies upon is being lost. What we do in the next 20 years will determine the future for all life on earth.”

Whereas my rant against the disruptive nature of Spotify adverts may seem far too insignificant to cause any dent on Planet Earth, this article is about the mayhem we have created even as we remain enamoured with advanced technology. In the last couple of months, we have seen an upsurge of AI-powered chatbots, image generators, audio generators and more. In many ways these technologies are amplifying our research, work, communication, interaction and much more. These tools are creating much-needed efficiency.

However, unknown to most, some of these tools are scraping large amounts of personal data off the internet, violating a host of laws, especially those protecting the rights to privacy and data protection. To protect themselves, these companies have generated self-regulating privacy policies as a window into their world. In turn, we have created a fragmented data protection environment where each country has its own regulatory framework. New laws are frequently passed. It is regulatory mayhem, and it is hard to keep up. I wonder if there is someone out there who sits back and asks, “What have we done?”

These tools are also being used as agents of a far more discriminatory world than the one we currently live in. In some countries, these tools are used to police people of colour and to deny them financial assistance, employment, even relationships. Using remote AI biometric identification systems, these tools have created an Orwellian society, and this is only the beginning.

We are at a vantage point – a twilight – where we have an opportunity to create change that reverberates throughout the entire Planet. Not only in Europe. Google recently postponed the launch of its AI-powered chatbot in Europe because of the protective European regulatory regime. But every human on this planet deserves protection from the adverse impact and/or potential impact of Artificial Intelligence. We can create privacy-preserving and ethically bound tools. We should, because “All across our planet, crucial connections are being disrupted. The stability that we and all life relies upon is being lost. What we do in the next 20 years will determine the future for all life on earth.”

While Max Tegmark, Eliezer Yudkowsky and other scholars address issues in the distant future, we need to address the current problem. Our AI regulatory framework should pick a few lessons from the regulatory mayhem under the data protection regime. We need an international governing framework that addresses AI problems at a planetary scale, not a pigeonholed system riddled with legal uncertainty.

Can we build trust instead of the empty expectations of data subject control? (3 January 2023)

Every day we make new memories. Because we have freely available cloud space to store them, and yet little to no time to reminisce upon and rehash them, we have become data hoarders. We have been encouraged to look forward to new memories, to buy new gadgets with more internal space and even more advanced cameras. We have been set into this kaleidoscopic motion, never stopping to ask an important question — why? A question that must be contextualized; a question that must be answered in different times and spaces.

Companies have told us that they are required by their policies (a commitment of aspiration between them and their users) to delete this data over a period of time; that we can always submit requests to delete and correct this data; that we can report them to supervisory authorities. But let us be practical about a few things. Very few people keep track of whether these companies are compliant. It would be even more unreasonable to expect most under-resourced supervisory authorities to track the compliance of all companies.

This approach lacks congruence – the web permeates borders but the law has been confined and restricted to its progenitor's borders; technology (tech) companies are leveraging advanced systems while supervisory authorities are relying on traditional methods of implementing the law. But even more concerning is the power imbalance between data subjects and the tech companies.

We have been assuaged with assurances that we shall have control over our data, but what does control mean? Is this control fixed in time and space? Why does control over personal data only come with burdens but not benefits? Black's Law Dictionary defines “control” to mean “to exercise power or influence over”. More importantly, we must ask ourselves whether we actually have the power to exercise, or any influence over, the data practices of these tech companies.

To have control is to have visibility into present and future data practices, but we don't have visibility. How then do we, the users, have control? To have control is to be enlightened about the value of the subject matter being controlled. How can we casually assign control to persons who are not digitally literate? Africa, for example, heavily relies on imported tech devices, yet its population is among the least digitally literate, with little to no understanding of compliance requirements often presented in voluminous text.

The concept of control over data is slowly proving to be ever more unreasonable and perhaps unattainable — ceteris paribus. We have been burdened with the duty to take control over our personal data without the benefit of fully exercising control over this data. In one Ugandan case — Aida Atiku v Centenary Bank (HCCS 754/2020) — the plaintiff, an elderly woman, lost millions of Uganda Shillings she had entrusted to the bank for safe custody; the court ruled that the plaintiff ought to have taken steps to ensure that her personal data was not accessed. We have been told to curate the right passwords for all our different online accounts and are expected to remember all these passwords — else this might result in contributory negligence.

Why doesn't the scope of control extend to the commercial benefits of this data? Control over personal data should equally give us the choice to profit from access to personal data. This, largely, speaks to fairness. Let us make a comparison. In the 1950s, a lady called Henrietta Lacks was diagnosed with cervical cancer. Her cells were harvested without her consent, and she eventually died. The cells were used to develop some of the most monumental advances in medical history, with pharmaceutical and other companies profiting from these cells, yet her family could hardly afford medical care.

On the other hand, Ted Slavin — a haemophiliac — was suffering from bouts of hepatitis B when his doctor told him that his body was producing something valuable. Coincidentally, researchers around the world were working to develop a vaccine for hepatitis B, and doing so required a steady supply of antibodies like Slavin's, which pharmaceutical companies were willing to pay large sums for. Slavin sold his serum to anyone who wanted it. But because Slavin hoped to cure hepatitis B, he later entered into a partnership with Baruch Blumberg, allowing Blumberg free access to his blood. It was Blumberg who later discovered and developed the first hepatitis B vaccine.

Why don't users have a bona fide choice to determine what this data can and will be used for? Customers have a choice to determine which products to buy based on companies' practices, including child labour and environmental practices, to name a few. And yet a data subject's control over personal data seems to be confined. But to have control is to have the freedom to choose and determine.

The importance of choice in determining and influencing how personal data is used cannot be overemphasized. Sharing personal data is highly risky — a fact that has been understated, deliberately or otherwise. For instance, phone companies have allowed us to unlock our phones using facial recognition, but this technique, efficient as it might be, comes with the risk of relinquishing one of the most vital aspects of our bodies.

Might you wonder whether advances in 3D technology might allow anyone to print a person's facial features, giving them access to any site that requires facial recognition biometrics? Might you wonder whether these facial features could be used in deepfakes? Might you wonder what the tech companies might use these features for in the long term — 100 years from now? What happens to this data in the next generation, and what will it be used for? Can one bequeath their rights to this data? Can we trust the purpose indicated in privacy policies? What happens when the purpose changes and the data subject is long dead? The more we are deluded into believing we have control, the more we lose it. These are the conundrums we must address when assigning control over personal data.

The spectrum within which we view the notion of control should be widened, both in time and space. Blanket assignment of control creates empty expectations and misperceived power over personal data. We must enlighten the masses about the value of this data but, more importantly, protecting personal data must go beyond compliance and risk mitigation. We must foster data protection as a social value. We must advocate for fairness, equity, transparency and other ethical values to guide the collection, processing and use of personal data.

The current approach to control over personal data is breeding a contest between users and tech companies — us against them. This has resulted in companies developing ever more esoteric, obscure and indiscernible ways to collect and process this data. Yet the data revolution continues to be greased with our data. Some of humanity's greatest problems will be solved by relying on vast amounts of data. Therefore, we must appreciate the shared interests and build trust along these lines.

Africa must standardize personal Data Privacy as a social norm (3 June 2022)

As early as the 19th Century, Arthur Schopenhauer posited that a man can be himself only so long as he is alone, for it is only when he is alone that he is free. Freedom of thought and speech have always been at the core of a robust society, and to uphold these rights is to protect the right to personal data privacy. The diversity and depth of thought in a society hangs in the balance if our people lose trust in the digital realm; if they fear that their personal data and opinions could be used disadvantageously against them.

In a TED Talk, Glenn Greenwald expressed this concern, stating that to be a fulfilled human being is to have a place we can go free of the judgmental eyes of other people. When we are in a place where we are being watched, our behaviour changes dramatically. People demonstrate conformist and compliant behaviour when they know they are being watched — essentially removing the authenticity that makes for a robust society. We must then consider the duty to protect personal data as a social norm; a value that should be woven into every aspect of our lives — an ethical culture that must permeate our private and public lifestyles.

On the one hand, when we speak about the legalities of personal data privacy and the requirements under the various (and I mean innumerable) laws and regulations, we tend to lose sight of the very essence of protecting personal data. On the other hand, we may attribute the casual and lax approach to a lack of understanding of the magnitude of the issue at hand. Our African people continue to fall prey to misconceptions carefully choreographed by technology companies — “Who needs privacy in the 21st Century?” But we do need privacy in this century as much as we did in the previous centuries and beyond.

Africa is home to the youngest population, a population that is gradually onboarding different forms of technology. This trajectory is on the rise as smartphone prices gradually fall and as commitments to enhancing access to electricity and internet connectivity grow. Some governments' initiatives evidence these efforts. For instance, the government of Uganda, under the Ministry of Information and Communications Technology and through the National Information Technology Authority of Uganda, launched the National Data Transmission Backbone Infrastructure and e-Government Infrastructure Project (NBI/EGI), whose goal is to connect all major towns to the Optical Fibre Cable based Network.

As a continent, we still have the opportunity to build and cultivate an ethical and privacy-compliant digital realm. A realm that leverages data as a strategic business asset but, more importantly, rests on an ethical and data privacy compliant ecosystem. This opportunity will become more profound as we usher in the African Continental Free Trade Area; a time when transborder data transfers will be prevalent.

It therefore behooves us to take the initiative to shape a society and business ecosystem that is pegged on trust, certainty and credence and that upholds our rights to freedom of speech and freedom of thought, among others. Building such a system will require governments to take on the responsibility of carefully crafting, infusing and standardizing personal data privacy as a social norm.

The Author is the Co-founder of ANJ Data Management Solutions Africa Ltd

Post pandemic governance: A time to re-learn, re-think and re-engineer governance, leadership and management (1 March 2021)

A friend recently reached out to me distressed about the ongoing governance issues at his company. The Board of Directors was at war with management, which in turn had a ripple effect on the rest of the teams. As we speak, most of the company's employees are out job hunting. “No one wants to be here when it all comes crumbling down,” he said. I thought to myself, this is just one among the many companies that have been rattled by the profound changes brought upon by Covid-19. I wondered what would become of corporate governance; was it time to re-think and re-engineer? What skillset is required to manage and lead during such times? Was it time to develop a whole new set of rules to manage companies? Would we come back from this?

The pandemic has created a period of high stress for all company stakeholders; an unprecedented volatile, uncertain, complex and ambiguous (VUCA) ecosystem. Now more than ever, companies look to robust governance, leadership and management. As leaders continue to navigate these uncertain times, it is clear that the north star will continue to be the principles of corporate governance: Fairness, Accountability, Responsibility and Transparency. Whereas the keys to being an effective and efficient leader have previously been creativity, communication, integrity and resilience, to mention but a few, the pandemic will prompt leaders to cultivate more empathetic and compassionate leadership; to develop resilient governance architectures — policies, procedures, systems and structures — and to look at governance through a fundamentally different prism.

Now more than ever, today's leadership and governance require a fine balance between profit maximization and strategic stakeholder management. Naturally, during uncertain times humans gravitate towards self-preservation, and this means many and diverse things to different stakeholders. All these dynamics require not only change management skills; like a skilled marionettist, a leader must also evenly balance the needs of the entire ecosystem. What is clear is that strategic and effective communication will be at the helm of managing during the new normal — whatever that means. Building back better will not only require more communication but also different ways of communicating, taking into account technology and the new ways of working.

As technology continues to form part of every facet of companies, they will need to zero in on data privacy strategies. Data Protection and Privacy is at the core of responsible use of technology, and leaders must embed a data privacy culture to better manage risk and meet the compliance requirements of their companies. This will require capabilities to merge corporate governance strategies with data privacy governance strategies. The early entrants to fostering data privacy cultures will inevitably better manage the ongoing technological revolutions coupled with the uncharted waters of this pandemic.

The pandemic has gradually acclimatized people to remote work as board and other collaborative meetings are held virtually. Leaders have to develop new and more effective ways of managing boardroom dynamics. This challenge will be more amplified for public companies but is equally crucial for private companies. Whereas companies have been accustomed to less frequent and more scheduled board meetings and reporting, there may be a need for more board meetings and more board involvement. This is because the role of a Chief Executive Officer (CEO) is burdensome enough for one person, even more so in these uncertain times. The pandemic needs all hands on deck; more leadership than management; more board involvement and more board oversight. Going forward, companies may have to be managed by more than one CEO, a strategy that some companies have already taken up.

Unfortunately, the pandemic has also widened the gap between women and men — a retrogressive reaction to the problem. More women have either been laid off or have resigned due to the difficulties they face in balancing family and work. The discussion regarding women on boards and women in senior executive positions has to go beyond appointing more women to providing a more conducive working environment for women, because the aim should be not mere equality but equity. The dynamics have changed from simply providing breastfeeding areas and office child care to remote solutions. Such work cultures will ensure that we do not retrogressively discriminate against women, whether intentionally or unintentionally.

In conclusion, as corporate leaders steer through the storm and whirlwinds, it is important to be reminded that building trust through empathetic and compassionate leadership is extremely crucial. Persistent and effective communication is equally important; veering away from voluminous documentation and improvising with technology and videography. Corporate leaders have to re-learn, re-think and re-engineer new ways of leadership and governance because, unfortunately for them, the pandemic occurred simultaneously with groundbreaking technological shifts that are equally shifting the tectonic plates underneath corporations.

Why companies should prioritize data privacy (22 February 2021)

As data subjects become more privacy-centric, it can only be expected that corporations, whose core belief is that the customer is always right and cash is king, should reciprocate their customers' needs. Evidently there has been a shift in customer behaviour as users have opted for privacy-focused applications like Signal, Telegram and DuckDuckGo. DuckDuckGo is a privacy-focused search engine that neither tracks users' searches nor shares users' personal data with third parties.

Companies must make a shift towards bridging the gap between organizational strategy and personal data protection. Some of the companies that have met the wrath of privacy fines for failing to make this shift include Walmart, British Airways, Marriott International Inc. and Google.

  • Walmart is settling a $10M class action suit with its Illinois employees for allegedly violating the Illinois Biometric Information Privacy Act, after employees claimed that the company collected their biometric data through its palm-scanning devices without their consent.
  • British Airways was fined 20M Pounds for failing to protect the personal and financial details of more than 400,000 of its customers.
  • The French Data Privacy regulator fined Google 50M Euros because it failed to provide enough information to users about its data consent policies and did not give them enough control over how their personal data is processed.
  • Marriott International Inc. was fined 18.4M Pounds for failing to keep millions of customers' personal data secure.

The peculiarity of the Marriott case should raise alarms for corporations conducting Mergers & Acquisitions. In 2014, cyber attackers hacked into Starwood Hotels' systems, resulting in a privacy breach affecting millions of its guests. The attack went undetected until 2018, after Starwood had been acquired by Marriott in 2016. This case should stand as a reminder for professionals conducting due diligence to take into account the data privacy standards of companies that they intend to merge with or acquire.

Under the GDPR, fines for non-compliance with the regulation are up to 20M Euros or 4% of the entity's annual global turnover, whichever is greater. The far-reaching hand of the GDPR has informed laws and regulations across the globe, hence most fines fall within that range. Granted, the myriad and fragmented Data Protection Laws across the globe have created a complex data privacy compliance framework, especially for multinationals. However, just as companies have managed to formulate thriving business models across borders, they must design data privacy models that are equally befitting.
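
To make the “whichever is greater” ceiling concrete, here is a minimal Python sketch of the arithmetic under Article 83(5) of the GDPR. The turnover figure and the function name are my own illustrative assumptions, not figures from any real case, and this is an illustration of the calculation rather than legal advice.

```python
def gdpr_fine_ceiling(annual_global_turnover_eur: float) -> float:
    """Upper bound of a GDPR administrative fine under Article 83(5):
    the greater of EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# Hypothetical multinational with EUR 2 billion in annual global turnover:
# the ceiling is EUR 80 million, well above the EUR 20 million floor.
print(gdpr_fine_ceiling(2_000_000_000))  # 80000000.0
```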

According to the Implementation and Compliance Guide by the IT Governance Privacy Team, “the prerequisites for implementing a complex compliance framework are knowledge and competence”. Compliance with the diverse laws may be fostered by engaging dedicated Data Protection Officers/Managers in the different countries to keep track of regulations and enhance compliance. More importantly, prioritizing and leveraging personal data privacy will foster a data privacy culture, making it easier to comply with the law and mitigate personal data breach risks.

Yuval Noah Harari postulates that the 21st Century is ushering in a new religion, Dataism, which “declares that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing”. He suggests that the new era will be woven by the Internet of All Things, as everything and anything will be plugged into the new system. Therefore, as we journey along, it is fundamentally crucial that we all cultivate a culture that fosters protection of personal data because, contrary to popular belief, the rise of data privacy is not meant to curtail innovation but rather to foster responsible use of technology.

The author is a Lawyer, Data Privacy Practitioner, Member of the IAPP and Co-founder of ANJ Data Management Solutions (A) Ltd.

Managing Personal Data Breaches (20 February 2021)

Introduction

The fluidity and intricate nature of data make it particularly complex to manage personal data breaches, owing to the volume, velocity and variety of data churned out in today's digital ecosystem. This complexity is further amplified by the varied scope of personal data breaches. As organizations embark on the journey to demystify personal data, they must distinguish security incidents from personal data breaches and adhere to the data protection principle of integrity and confidentiality. Whereas there isn't a one-size-fits-all approach to managing personal data breaches, this paper will enlighten the reader on how to manage these breaches under two legal frameworks: the General Data Protection Regulation (GDPR) and Uganda's Data Protection and Privacy Act, 2019 (DPPA). The author appreciates the far-reaching hand of the GDPR and recommends that, in addition to adhering to the domestic law, organizations should strive to adhere to the GDPR — a regulation that has been acclaimed as a yardstick for international best practice.

What is a personal data breach?

Recital 87 of the GDPR states that when a security incident takes place, the controller should quickly establish whether a personal data breach has occurred and promptly take steps to address it. The regulation contemplates that not all security incidents result in personal data breaches, which makes it critical to define a personal data breach. A personal data breach means a breach of security leading to accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data transmitted, stored or otherwise processed, which may in particular lead to physical, material (e.g. financial loss) or non-material damage (e.g. identity theft). Broadly, it is a security incident that affects the confidentiality, integrity or availability of personal data.

In order to address a breach, a controller should be able to recognize whether the breach is one of confidentiality, integrity or availability. A confidentiality breach occurs where there is an unauthorized or accidental disclosure of or access to personal data. An integrity breach occurs where there is an unauthorized or accidental alteration of personal data and an availability breach occurs where there is an accidental or an unauthorized loss of access to or destruction of personal data.

Personal data breaches may emanate from security incidents such as ransomware attacks or data exfiltration attacks, but they may also emanate internally, where employees accidentally send emails to the wrong recipients or where devices containing personal data are lost or misplaced. A personal data breach is not limited to digital platforms; it may also occur if documents containing personal data are misplaced or posted through snail mail to the wrong recipient. What is fundamentally crucial is the ability to identify when a personal data breach has occurred so as to determine what steps should be taken to address or mitigate the breach, taking into account its nature and scope as well as the risks and severity to the rights and freedoms of data subjects.

Duty to notify the supervisory authority

The GDPR introduces a requirement to notify the supervisory authority of personal data breaches. Article 33 of the GDPR states that in case of a personal data breach, the controller must, without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the supervisory authority, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification is not made within 72 hours, it must be accompanied by reasons for the delay.

It is notable that under the regulation, not all personal data breaches warrant notification to the supervisory authority. The Data controller has the discretion to determine whether the breach will result in a risk to the rights and freedoms of the data subjects. Such discretion must be exercised judiciously. For example, if an employee accidentally deletes a client's personal data but this data can be restored from the company's backup, the company may not need to report this availability breach to the supervisory authority.

Uganda’s DPPA, on the other hand, imposes a duty on the Data controller, Data Collector and Data Processor to immediately inform the Authority (National Information Technology Authority) of any data breach. The Act removes any room for ambiguity — all breaches must be brought to the attention of the Authority.

The notification to the supervisory authority should at least:

· describe the nature of the personal data breach including, where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned;

· include the name and contact details of the Data Protection Officer or other contact point where more information can be obtained;

· describe the likely consequences of the personal data breach; and

· describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.

The GDPR acknowledges that this information may not be available to the Data Controller and allows the Data Controller to provide the information in phases without undue further delay.

Duty to notify the data subjects

Article 34 of the GDPR states that where the personal data breach is likely to result in a high risk to the rights and freedoms of natural persons, the controller must communicate the personal data breach to the data subject without undue delay. Such breaches may, if not addressed in an appropriate and timely manner, result in physical, material or non-material damage to the Data Subjects such as loss of control over their personal data, discrimination, identity theft, financial loss, damage to reputation, loss of confidentiality of personal data protected by professional secrecy or any other significant economic or social disadvantage to the natural person concerned. Once again, under the GDPR, the controller has the discretion to determine whether the Data Subject should be informed. For example, if a bank's system is hacked and its clients' financial data are exfiltrated, the bank has an obligation to notify the Data Subjects without undue delay.

The communication to the data subject must describe, in clear and plain language, the nature of the personal data breach and contain:

· the name and contact details of the Data Protection Officer or other contact point where more information can be obtained;

· the likely consequences of the personal data breach; and

· the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.

The notification has to provide sufficient information to allow the data subject to take protective measures against the consequences of the unauthorized access. However, communication to the data subject is not required if any of the following conditions are met:

· The controller has implemented appropriate technical and organizational protection measures and those measures have been applied to the personal data affected by the breach, in particular, those that render the personal data unintelligible to any person who is not authorized to access it such as encryption.

· The controller has taken subsequent measures which ensure that the high risk to the rights and fundamental freedoms of data subjects is no longer likely to materialize.

· It would involve disproportionate effort. In such a case, there should instead be a public communication or similar measures whereby the data subjects are informed in an equally effective manner.

On the other hand, according to the DPPA, it is the Authority which determines whether the Data subject should be notified. Where the Authority determines that the Data Subject should be notified, the notification must be made by any of the following methods:

· Registered mail to the data subject’s last known residential or postal address;

· Electronic mail to the data subject’s last known e-mail address;

· Placement in a prominent position on the responsible party’s website; or

· Publication in mass media.

Exercising the discretion under the GDPR may be somewhat perplexing for Data Controllers. Nonetheless, Data Controllers must at all times keep and maintain a register of all personal data breaches, whether or not they were reported. It may help to have an internal measurement framework for determining a risk that warrants reporting to the Supervisory Authority, a high risk that warrants reporting to the Data Subject, or both.
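
By way of illustration only, the sketch below (in Python) shows one way such an internal framework and breach register might be structured: each incident is classified by breach type and risk level, and the 72-hour clock under Article 33 is tracked automatically. The class names and risk tiers are my own assumptions, not requirements of the GDPR or the DPPA; under the DPPA, recall, every breach must be reported to the Authority regardless of the assessed risk.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum

class BreachType(Enum):
    CONFIDENTIALITY = "confidentiality"  # unauthorized disclosure of or access to personal data
    INTEGRITY = "integrity"              # unauthorized or accidental alteration
    AVAILABILITY = "availability"        # accidental or unauthorized loss of access or destruction

class RiskLevel(Enum):
    NEGLIGIBLE = 0  # e.g. data promptly restored from backup, no exposure
    RISK = 1        # notify the supervisory authority (GDPR Art. 33)
    HIGH_RISK = 2   # also notify the affected data subjects (GDPR Art. 34)

@dataclass
class BreachRecord:
    description: str
    breach_type: BreachType
    risk_level: RiskLevel
    detected_at: datetime
    reported_to_authority: bool = False
    data_subjects_notified: bool = False

    def authority_deadline(self) -> datetime:
        # Art. 33: notify without undue delay and, where feasible,
        # within 72 hours of becoming aware of the breach.
        return self.detected_at + timedelta(hours=72)

@dataclass
class BreachRegister:
    # Art. 33(5): document every breach, whether or not it was reported.
    records: list = field(default_factory=list)

    def log(self, record: BreachRecord) -> None:
        self.records.append(record)

    def overdue(self, now: datetime) -> list:
        # Breaches that warrant notification but have passed the 72-hour mark.
        return [r for r in self.records
                if r.risk_level is not RiskLevel.NEGLIGIBLE
                and not r.reported_to_authority
                and now > r.authority_deadline()]
```

A controller might, for example, log an e-mail sent to the wrong recipient as a confidentiality breach, grade its risk, and rely on overdue() to flag anything that has slipped past the notification window, while the register itself documents every breach whether or not a report was made.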

Technical and organizational measures

One of the most important obligations of the data controller is to evaluate risks and implement appropriate technical and organizational measures to address them. According to Article 5(1)(f) of the GDPR, personal data must be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorized or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organizational measures.

Section 20 of the DPPA imposes a duty upon the data controller, data collector and data processor to secure data subjects’ personal data in line with the integrity and confidentiality principle of data protection. Below are some measures that may be employed to manage or mitigate personal data breaches:

· Robust data protection impact assessments will facilitate identification of reasonably foreseeable internal and external risks to personal data.

· Pseudonymization and encryption of personal data (a minimal sketch of pseudonymization follows this list).

· E-mails to multiple recipients should be sent using blind carbon copy (bcc).

· Disable auto-complete when typing e-mail addresses.

· Encourage employees to double check files before sending them.

· Ensure that all mobile devices containing personal data have very strong passwords.

· Turn on mobile device functionalities that enable them to be located in case they are lost.

· Employ multi-factor authentication methods.

· Ensure that you have robust breach detection, investigation and internal reporting procedures.

· Training and awareness on data protection issues focusing on personal data breach.

· Keep all documents containing personal data in secure locations.

· Establish proper access control policies and procedures.

· Look out for unusual data flows.

· Have systematic IT security audits.

· Have plans and procedures in place for handling eventual data breaches.

· Have a contingency plan to deal with subject access requests and erasures.

· Controllers should not transfer personal data to Processors unless and until they have determined that the Processors have in place technical and organizational measures to protect the personal data.
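
As an illustration of the pseudonymization measure above, here is a minimal Python sketch using a keyed hash (HMAC-SHA256) to replace a direct identifier with a pseudonym. The key value and the example e-mail address are purely hypothetical; a real deployment would pair this with proper key management and, where reversal is needed, a separate lookup table.

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    Without the key, the pseudonym cannot easily be linked back to the person,
    so the key must be stored separately from the pseudonymized dataset."""
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical usage: pseudonymize an e-mail address before analytics processing.
key = b"example-key-kept-in-a-key-management-system"  # assumption: key is not stored with the data
print(pseudonymize("jane.doe@example.com", key))
```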

Organizations must employ a finely balanced approach to managing these breaches, i.e. both a risk-based and a compliance-based approach. Compliance teams, cybersecurity teams and data governance experts must offer effective and efficient leadership to enable teams to identify and report personal data breaches.

Conclusion

As earlier noted, there isn't an all-encompassing approach to managing personal data breaches. However, it is important that organizations cultivate a data protection and privacy culture that is deeply woven into the fabric of the entity so as to enable strategic management of personal data breaches. This will not only facilitate agility in identifying breaches but also enhance mitigation efforts. The importance of managing these breaches cannot be overemphasized, not only because it is a compliance requirement but because failure to manage personal data breaches may result in criminal sanctions. The DPPA imposes a fine of Ugx 4,800,000 or imprisonment for 10 years or both for unlawfully obtaining or disclosing personal data. It is equally an offence to unlawfully destroy, delete, conceal or alter personal data, with a penal sanction of a fine not exceeding Ugx 4,800,000 or imprisonment not exceeding 10 years or both. The author recommends that the Ugandan legal system should develop guidelines against which Data Controllers, Processors and Collectors can benchmark when in doubt. In the interim, the European Data Protection Board guidelines regarding personal data breach notification may offer the much-needed guidance.

The author is a Lawyer, Data Privacy Practitioner, Co-founder of ANJ Data Management Solutions (A) Ltd and a Member of the IAPP.
