Review of Behavioural Conflict: Why Understanding People and Their Motives Will Prove Decisive in Future Conflict
Information Professionals Association | 27 April 2025

by Michael Williams

“If you haven’t read hundreds of books, you are functionally illiterate, and you will be incompetent, because your personal experiences alone aren’t broad enough to sustain you.” – former Secretary of Defense James Mattis

I’ve had this book on my ‘re-read’ list for the past year. When I was heavily involved in reviewing our information activities in Iraq and Afghanistan in my Office of the Secretary of Defense (OSD) advisory role, I always valued the opinions of our United Kingdom (UK) colleagues, who could often clearly see where we were going wrong.  Why read (much less write a review of) a book about events of 15-20 years ago, centered on conflicts that are increasingly criticized for mismanagement, bungled leadership, and a lack of clear strategic planning?  I re-read this book because I’m worried that we are burying our past and shying away from learning from our mistakes.

There is much to learn (re-learn?) from a review of what happened 15 years ago.  Have we integrated lessons learned into our planning and execution of Operations in the Information Environment (OIE)?  More importantly, have we learned to adapt to the modern battlefield at the speed of current change? I didn’t know it when I read this book in 2012, but with the benefit of hindsight, I certainly know now that the ability to adapt to the environment in which you’re fighting is the most important quality we need in leaders on the modern battlefield.

So, what is this book about?  In chapter after chapter, the authors identify a skill essential to success in all conflicts:  The ability to examine assumptions, learn from mistakes, and boldly define what is needed in the face of resistance from those above and below you.  It would be easy to say MacKay and Tatham were trailblazers in every instance, but they did identify leaders who had the qualities necessary to win in modern conflict.

I’ve learned a lot from both my mistakes and my successes, but far more from the former, and I think the authors would agree.  This book, however, doesn’t dwell on mistakes.  Retired Major General Andrew MacKay and retired Commander Steve Tatham identify the primary role of an OIE planner and operator as understanding the decision-making of the adversary AND of any individual or group that can affect the outcome of a military operation.  I don’t think Tatham or MacKay are outliers in this regard.  Both have a natural curiosity about human decision-making and use that curiosity to determine the motivations of both the adversary and those who can affect the outcome of military operations.  This curiosity is a prerequisite to being a successful OIE planner and operator.

This is also not a book for pointing out the shortcomings of doctrine or military capabilities, though the authors note the lack of both early in the Afghanistan and Iraq operations. For each conflict highlighted in the book, whether in the Balkans, Sierra Leone, Lebanon, Iraq, or Afghanistan, there was always a shortfall of information about combatants’ decision-making. There is no manifesto for making over Psychological Operations (PSYOP), the military planning process or, God forbid, the military acquisition process.  The PSYOP Target Audience Analysis (TAA) process comes in for high praise as a model to be adopted by all military forces that find the population is indeed part of the military terrain (as it is likely to be in any future conflict).  But before the TAA is put to work, we need planners curious about what motivates combatants and others, and about what information they consume to inform their decisions.

 

———–

The nature of war has not changed; it remains as it has been since conflict became the domain of professional soldiers, and perhaps even before that:  Bending the enemy to your will.  To do that, of course, you must first identify the ‘enemy’, and if only it were a simple matter of pointing out people wearing the ‘wrong’ uniforms.  That’s less likely now than ever before.

It would be easy to write off our information-related efforts in Iraq and Afghanistan as flawed efforts rooted in a flawed strategy, and certainly some of that is discussed in the book.  Again, though, this is not a book about failed strategies.  MacKay and Tatham spend much of their analysis on the two major conflicts of the past 20+ years but, just as importantly, point out the lessons learned in the Balkans, Sierra Leone, Libya, Lebanon, and Gaza.  Like me, they took experiences and lessons from previous conflicts and began to see that human decision-making is subject to the same heuristics regardless of the continent on which you may find yourself.  What has changed over the past 30 years is the ever-increasing volume and speed of information traveling to ubiquitous handheld devices. Every conflict cited is a reminder that all military operations will eventually find the information environment carrying information between individuals and groups, affecting the decision-making not just of adversary forces with military weapons but of anyone who can affect the outcome of military operations.  Again and again, MacKay and Tatham identify unexpected actors able to influence the outcome of operations in completely unexpected ways.  Though, why should we have been surprised?

“As Al-Qaeda’s master strategist Ayman al-Zawahiri once observed: ‘We are in a battle, and more than half of this battle is taking place in the media.’”

The authors assert that “understanding behaviour deserves greater resonance and involvement in the contemporary operating environment.”  This now seems obvious to anyone who has experienced conflict in the past 25 years, yet our military formations devoted to this understanding have shrunk, and we now look at cyber operations as the nucleus of our information activities.  I’m not the only one wondering how we’re going to orient cyber forces to execute influence operations, but this was not a worry of MacKay and Tatham in 2011.

My own experience in Afghanistan was certainly shaped by my planning and execution of information activities in the Balkans.  When I arrived in Afghanistan in the early spring of 2002, my team and I had left Kosovo only a little more than three months prior.  Our efforts in Kosovo focused on local community leaders in the ethnic enclaves, from which we learned much about decision-making among the former warring parties.  Given the relative calm (at the time) in Afghanistan, and with the encouragement of the parallel special operations command, which had also found this method very useful, we set out to get our leaders out to see local Afghan leaders, particularly those who had returned to Afghanistan from secure and comfortable expat lifestyles abroad at the encouragement of the nascent Afghan central government.  This was to be an eye-opening experience.  On one occasion, we met with the governor of a province on the Afghan-Pakistan border.  Our party was led by the Task Force Chief of Staff (at the time, the 10th Mountain Division was the core of the American forces on the ground).  After pleasantries, we started down a list of talking points prepared by the staff.  High on the list of points the senior leader I was supporting wanted to raise with this particular governor was the matter of working with former Taliban (as characterized by task force intelligence sources).  Our communication with this governor was excellent, as he was a fluent English speaker brought back from his expat life in the West to rebuild a provincial government.
The governor was quite taken aback and was very articulate in his response, which remains burned into my brain: “If we did not work with those whom you consider former Taliban, we would have no one in the Province to work with.”  My senior leader was left speechless momentarily and quickly moved on, but it was tremendously revealing about our failure to understand the motivations and decision-making of the population – a failure that would continue to haunt us for at least another 10 years, by which time we had made many mistakes in evaluating the Afghan population’s decision-making.

MacKay and Tatham cite similar difficulties in evaluating the poppy-growing farmers of Helmand Province.  Leaders in our respective capitals saw this behavior simply as something to be eradicated.  Yet, to Pashtun farmers, this was not about picking a side between the Taliban and ISAF.  It was about having enough money to feed and care for a family, and taking away that source of income while being told to ‘grow wheat’ was a stark reminder that we could inadvertently influence decision-making in a way that was devastating to our ability to achieve military objectives.

In another Afghanistan example, the authors highlight the ever-present use of the term “democracy” in our communications with senior leaders and the population, but in a society with no history of democratic governance, the message often had no meaning to the receiver and thus influenced none of their behavior.  MacKay and Tatham suggest we need a “granular understanding” of audiences that can affect the outcome of military operations and hinder achievement of theater and strategic objectives (and again, they highlight the excellent TAA process as a curative tool).

The book highlights UK Ministry of Defence guidance to UK forces in Iraq referencing the “trinity of democracy, liberation, and freedom,” but when transmitted to the Iraqi population, the feedback implied the populace believed they were to be liberated by foreign forces, a message that lacked any positive resonance.  It would be easy to say we should ‘keep it simple’ and transmit basic messages, but even the simplest message requires some perspective-taking to understand how it will be received and acted upon.  I observed first-hand a US general telling his subordinate commanders that Afghans who asked why we were in Afghanistan should get a one-word response:  “Revenge.”  Having gotten approval from the theater commander for this message, he believed it would resonate with a population steeped in an honor code.  It was his attempt at perspective-taking.  Sadly, each audience interpreted the message a little differently, and each judged a little differently against whom revenge was being taken.  The effect was often not the one desired.

This book is about how we often fail to learn and adapt at the tactical, theater, and strategic levels when assumptions meet reality.  Again and again, the authors cite instances where we had to question our assumptions, and where, when we DID take the time to learn how we might achieve an advantage in a complex counter-insurgency combined with counter-terrorism operations, we found the needed actions challenged our very conception of what constitutes a military operation.

Strategically, the inability of both the US and UK governments to recognize the centrality of the population in both conflicts deserves much criticism, but that is not the domain of military planners, and the authors recognize that a misstep in a capital thousands of miles away is no reason to repeat the mistake in tactics used on the ground.  Many examples are cited in the book of planners facing a lack of proper tools, doctrine, training, and guidance to achieve tactical objectives on the ground.  MacKay and Tatham realized they needed to review their understanding of human behavior in their unit’s area of responsibility.  They were fortunate to have had the opportunity to observe decision-making and behavior in previous conflicts and understood that while conditions were different in each operation, the humans were much the same in how they used information and applied it within their own culture and environment. Commanders needed an understanding of why Iraqis or Afghans made decisions that, to the outside observer, seemed inimical to their own interests but made perfect sense to those being affected by military operations.  Those decisions often affected the outcome of military operations.

MacKay and Tatham point out many mistakes made by leaders at all levels throughout these two conflicts and others.  How those leaders overcame their mistakes is the most important lesson to draw from this book.  They adapted and used the tools necessary, not necessarily those they were given by their military.  A basic understanding of a society, its history, and its people’s motivations is necessary to begin to understand the local information environment.  While each culture is distinct, the authors also point out the many commonalities of human behavior and the extensive resources available to help leaders better understand this component of the battlefield.

There are many reasons we should understand our past, and examples of successes are always a pleasure to read, but it’s more important to read about our failures, whether because of misinformed leaders, poor planning, or a more adaptable adversary.  The primary reason will always be:  To Not Repeat The Mistakes of the Past. 

All OIE leaders should read this and other books about our experiences of the past 25 years.  Gen Mattis’s advice is all the justification you need, but every leader should seek to learn from our mistakes.  The book about our OIE successes in Afghanistan and Iraq has not yet been written, whether for classification reasons or lack of interest, but even if it is written, we won’t learn as much from it as from books like this one by Maj Gen (ret) MacKay and Commander (ret) Tatham, who showed us that despite all the shortcomings in doctrine, training, and capabilities, we do have the ability to learn and adapt.  Challenging our assumptions and adapting to our ever-changing environment is not the easiest task, but it is an essential one for all OIE leaders.

LTC (USA ret) Michael Williams is a retired Army Information Operations officer.  Following his retirement in 2006, Mike became an advisor to the Information Operations policy office in the Office of the Under Secretary of Defense for Intelligence, and later to the Under Secretary of Defense for Policy, working on Secretary Gates’ program review of IO among many other projects over a seven-year period.  Since leaving his advisory position, he has become a Senior Analyst for Cognitive Performance Group, which focuses on analyzing decision-making and developing models of expertise in support of training and leader development in DoD.  Mike was the Executive Director of IPA during its formative period and currently supports IPA’s efforts to expand understanding of Cognitive Security. Mike also owns a real estate brokerage in a small town in the Finger Lakes region of central New York, where he makes his home.

The Cognitive Domain is Where Ghosts are Real
Information Professionals Association | 25 October 2024

by Dr. Sean Guillory

“Monsters are real, and ghosts are real too. They live inside us, and sometimes, they win” – Stephen King


Let’s start this Halloween-themed piece of writing with a confession: I don’t know what the cognitive domain is. 

I know: sheer heresy.

I work with the cognitive domain a lot (it’s in my job title, and I work with organizations like the IPA whose primary mission is focused on this space), but still, some of the hardest questions I get are What is the Cognitive Domain? and Where is the Cognitive Domain? (and, unfortunately, these are usually the first questions someone new to this work will want to ask).

Now, there are some good definitions for the cognitive domain to put forward for consideration (I think the best one routinely cited by national security writers comes from Paul Ottewell’s Defining the Cognitive Domain paper), but in my opinion, none of them really explain the “Where” and “What is it” well.

There’s also a more complicated question, one I haven’t seen mentioned in cognitive security/warfare thought-leadership circles, that is important to address: What is the material proof that the cognitive domain is real? This may come off as a ridiculous question given the experience of being a human, having ideas, and having those ideas understood when you share them. But even though the cognitive domain feels ever-present to one’s conscious life, how can a person show this “obvious” thing to other people without ultimately having to trust self-reported experience? It’s the same as trying to prove that dreams are real: even with extensive historical records and one’s personal experiences, there’s no material proof that other people actually dream beyond the trust one puts in their self-reports. Sure, dreams have been frequently studied, and interesting correlations have been found in neuroscience (including recent attempts by scientists to use sounds to trigger specific experiences in a dream), but until technology gets to the Being John Malkovich level of experiencing another person’s embodied experiences, the experience of dreams (or the cognitive domain) rests on a foundation of trust in the self-reports of other people.

I recognize that these Hard Problems of the Cognitive Domain (What is it? Where is it? Is it real?), a riff on the Hard Problem of Consciousness, are a bit radical (even the thought leaders who challenge the need to include the cognitive domain in operations planning don’t challenge the existence of the domain itself), but this seems like the right time of year to share the questions that keep me up at night. Like I said at the beginning, I still don’t have the answers, but in the holiday spirit, let me give you a Halloween-themed, fun-sized treat that I hope can get us to sufficient answers someday:

The cognitive domain is where ghosts are real.

By that spooky phrase, I don’t mean that there is a sort of paranormal function in our cognitive beliefs (if that’s what you’re looking for, check out this literature review). I mean that when planning operations that use elements typically associated with the cognitive domain, planners need to treat the elements that the targets truly believe are real as “functionally real” even if the planners don’t believe they are real. Let’s unpack this a bit before going into examples:

  •  By elements of the cognitive domain, I again unfortunately don’t have a good definition, but I do have a morbid hypothetical about an Unconscious Universe that I frequently use to see if it would be a cognitive domain element or not: if all higher-level cognitive beings were erased from the universe (obviously humans, but I don’t know where the line should be drawn for the rest of the animal kingdom), would said element still exist or not? I find this intuition pump handy in differentiating the more information-based elements that would still exist without cognitive beings (e.g., written language, data infrastructure, laws, etc.) from the cognitive-based elements that would not exist without cognitive beings existing (e.g., thought, belief, deterrence, norms, etc.).
  • By functionally real, I am not saying that the planners need to necessarily act like the target’s beliefs are 2+2=4 levels of true or to prioritize the target’s beliefs over beliefs that planners feel that they have much more justified evidence on. I am saying to acknowledge and understand a target audience’s beliefs as much as one can and to be open to using those beliefs in cognitive security operations planning like they are functionally real.

And by “ghosts,” well…I do mean a couple of things, including what many folks think of as ghosts, but let me show you what I mean with a couple of historical examples:

  • After World War II, the U.S. government was tasked with helping the newly independent Philippines fight the communist anti-Japanese rebel group, the Hukbalahap (Huks). Air Force officer Edward G. Lansdale and the newly formed CIA developed a plan not just to fight the Huks but to dissuade Filipinos from joining them. So what did the CIA do? They turned to Filipino folklore and the story of the Aswang, which you can think of as a sort of vampire. Lansdale’s group would hunt Hukbalahap rebels and, after their deaths, puncture holes in their necks and drain the blood from the bodies, leading the rebels and the villagers to think that an Aswang had attacked the dead rebel. The rebellion was put down within two years of the start of the Aswang operation.
  • Operation Wandering Soul was a Vietnam War operation in which the U.S. Army would play audio of wandering souls on the battlefield at night (one example being what is now called Ghost Tape Number Ten). The idea was born from the Vietnamese cultural belief that those not given a proper burial (like many of the soldiers who died in battle) would continue to wander the earth. Results of this operation were seen as mixed, but two other, non-spooky effects came from it: (1) the audio kept the adversary awake, and (2) adversaries would sometimes fire toward the sounds, giving away their positions.

The ghosts can take the form of monsters that are believed in (like the stories above), or of gods that are revered:

  • The story of the Trojan Horse is popularly thought of as the Trojans being fooled by the generosity of the Greeks, but the real reason why they brought the horse into the city was in fear of offending the goddess Athena (see the fate of Laocoön who tried to convince his fellow Trojans to burn the horse).
  • The Persians annexed ancient Egypt at the Battle of Pelusium in 525 BCE by bringing cats (an animal the Egyptians considered sacred) onto the battlefield and using them as hostages. The Egyptians, afraid to shoot the cats with their arrows, allowed the Persians to storm Pelusium and win the battle.
  • The ancient Romans had a practice called evocatio where, shortly before attacking a city, they would ritually reach out to the city’s deities to ask for their favor with promises of even more temples, worship, and renown. On top of the spiritual insurance of avoiding the repercussions of sacrilege, it motivated the Roman army to believe that victory was certain. Evocatio wasn’t just a technique to motivate the side performing the ritual; it also demoralized the opposing side. Many stories of the Old Testament speak of the fear and panic of losing divine support.

Beyond treating ghosts and gods as functionally real, I also want to bring up the seeming opposite of the ghost: the black swan. If the ghost is an entity that, to a person, appears real and obvious yet is hard to prove exists, the black swan is not obvious and seems so unlikely that it usually isn’t considered possible, yet one event can undeniably prove that it can happen and show that the conclusion gathered from inductive reasoning was a ghost all along. No one expects billiard balls, rats, donkeys, cigars, the sounds of a crying baby, pagers, or walkie-talkies to explode, but that inference is simply a ghost that is taken for granted as real.

Considering all this, let’s go back to the Hard Problems of the Cognitive Domain. I still don’t know how to answer them. But I’m confident in saying that the ghosts/gods/obvious conclusions of the cognitive domain can be anything, anywhere, at any time. That is probably a terrifying proposition (Happy Halloween!), but note that the acme of a cognitive security attack is either (1) the target facing an unknowable, incomprehensible Lovecraftian monster where “The words reaching the reader can never even suggest the awfulness of the sight itself [where it] crippled our consciousness so completely,” or (2) the target not noticing that anything has changed. Now let’s be real: most cognitive security attacks are closer to Punk’d than Poltergeist, but the principle of working with the ghosts the target believes to be real still applies.

In terms of other functional considerations for working with ghosts in the cognitive domain, let me list a couple (all of them probably deserve a paper of their own, but this paper itself is already running a little long):

So, are ghosts real? Are dreams real? Are gods real? My answer is that they are as real as the cognitive domain is.

What is the cognitive domain? Where is the cognitive domain? Is the cognitive domain real? I still don’t know, but there are a lot of advantages in warfare in treating the cognitive domain like it’s real, so I treat it like a ghost…or a dream…or a…

About the Author

Dr. Sean Guillory utilizes his cognitive neuroscience training to help with cognitive/human domain capabilities within Defense and National Security. He is also a Board Member of the Information Professionals Association.

Information Operations: The Primer I’ve (You’ve) Been Looking For
Information Professionals Association | 13 September 2024
by Brian Russell

I am incredibly thankful for Steve Tatham’s latest book, Information Operations: Facts, Fakes, Conspiracists. I am often asked by colleagues what books they should read to quickly understand the environment and capabilities that comprise this realm of warfare. While I’m accustomed to pointing people to LikeWar (Singer and Brooking), Information Operations now stands in a similarly pre-eminent place on my bookshelf and will serve as one of my “go-to” recommendations for novices and information professionals alike.

Why? Because Steve Tatham speaks with unimpeachable authority. Unlike more academic treatments, Information Operations provides real case studies backed up by Steve’s intimate involvement in many of them as a former Royal Navy information professional who served across the UK defense establishment in and out of uniform. On a personal level, I was delighted to find that my former command benefitted from the groundwork Steve’s 15 (UK) PsyOps Group established during Op Herrick in Afghanistan. I deployed with Marine Expeditionary Brigade-Afghanistan to Helmand Province in 2009 where we integrated with the Royal Air Force IO division at Camp Bastion to synchronize influence operations in defense of our now combined strategic operating base and airfield. Later, as commander of II MEF Information Group, I worked extensively with the Royal Marines’ version of my command, 30 Commando, and found Steve’s treatment of recent UK information operations very familiar.  

So what’s in store? As a further testament to his expertise in the field, Steve starts in chapter 1 by defining exactly what he means by the term information operations (IO). On the American side of the profession, we still seem to be awash in multiple terms for this realm: operations in the information environment, information warfare, etc. Steve is right to nail this down from the beginning, and he settles on a fairly broad approach that includes “everything that gets done in the information environment” except cyber. Excluding cyber from IO is a longstanding debate, and Steve is right not to waste his time on it, keeping his treatise, particularly from a military IO perspective, focused on the art and science of informing and influencing audiences. If you are looking for more information on cyber operations, you should look elsewhere.

Speaking of where to look, Information Operations is one of those books you may want to “skip to the good parts” depending on your background. The early chapters on the UK (chapter 2), the US (chapter 3), past IO campaigns (chapter 6), and our mutual adversaries (chapters 7 and 8) are excellent primers of what has happened in the space. Servicemembers of my generation will want to take a trip down memory lane for IO successes and failures in Iraq and Afghanistan (chapter 4). And Steve does a masterful job weaving in current events like the war in Gaza and the ongoing conflict in Ukraine so the book is contemporary. While the information professional will be familiar with these examples, Steve structured the entire book to help them delve deeper. Information Operations is referenced with a supporting website (IOFFC.info) providing audio and video resources for each chapter as a great jumping-off point for further research. 

I think the best parts of Information Operations come in chapters 4 (Audience) and 9 (Psychology). These chapters detail the hard work and time-tested approaches that make IO effective. As a warning to the intelligence professional: all the intent in the world won’t bring real change in a target population’s behavior without a detailed cultural understanding of that audience. And Steve is right to warn that you can only glean so much information about an audience from afar. Technology is helpful but won’t give you all the answers in this realm, so you need to spend some time with “boots on the ground.” Chapter 9 gives a similar warning about the intent to win “hearts and minds.” A tendency of the IO community of late, and of leaders looking for quick wins, has been to leverage commercial marketing techniques to shift the attitudes of target audiences. Steve shows the folly of this approach with several examples and rightly points us toward the goal of behavior change as something entirely possible: “if you know your audiences well enough and understand a little of the psychology of human nature, then predicting human behaviour in certain circumstances is perfectly possible” (p. 172).

Steve closes the book with a treatment of conspiracy theories that will interest anyone immersed in the modern information environment, which is perhaps all of us. His detailed overview of the Cambridge Analytica affair (chapter 11) is fascinating in its own right, but I think it also serves as a cautionary tale for leaders in the space looking for any “magic bullets” to solve their IO challenges. Especially in the military community, information has recently taken a prominent position, and there is no lack of businesses offering quick-win solutions to these challenges. Leaders would do well to take a hard look under the hood at these capabilities to see whether they match what Steve lays out in this book before buying.

I am already looking forward to Steve’s next book. I welcome his thoughts on the emerging trend of artificial intelligence (AI), which he touches on briefly in his last chapter. My overall takeaway from Information Operations, supported by my own experience, is that this space has no free wins. In the human realm, effects and behavior change come from hard work, a deep understanding of an audience, and the persistence to achieve the outcome you desire. Technology will always be helpful but won’t completely win the day. Steve reminds us of this, so Information Operations is well worth your time and further study. It is also well worth your investment: all book royalties go to Hounds for Heroes, an organization providing specially trained assistance dogs to injured and disabled men and women of the UK Armed Forces and Emergency Services.

Brian Russell is the founder of Information Advantage and a Key Terrain Cyber Senior Fellow. He is a retired Marine Corps artillery officer whose previous assignments include commanding officer of II Marine Expeditionary Force Information Group (II MIG) and 1st ANGLICO. His combat deployments include serving as the Military Transition Team Leader in Habbaniyah, Iraq; executive officer of Brigade Headquarters Group in Helmand Province, Afghanistan; and Plans Director in Bagram, Afghanistan. His notable staff assignments include the Operations Directorate at Marine Corps Special Operations Command, the Operations Directorate at United States Cyber Command, and the Plans Directorate at Marine Corps Forces Cyberspace Command. He recently joined Peraton as a Cyber and Information Warfare subject matter expert. He is also an Information Professionals Association member.

1. Tatham, Steve. Information Operations: Facts Fakes Conspiracists. Howgate Publishing Limited, 2024.

The post Information Operations: The Primer I’ve (You’ve) Been Looking For appeared first on Information Professionals Association.

The Kremlin’s Information War as Counterbalance to Western Commitment https://information-professionals.org/the-kremlins-information-war-as-counterbalance-to-western-commitment/ Mon, 02 Sep 2024 20:40:33 +0000

The Kremlin’s Information War as Counterbalance to Western Commitment 

Author: Jerry E. Landrum, PhD

In the early days of the Russo-Ukraine War, the Ukrainian government initiated an information campaign to solicit military support from powerful Western states.  The video of Volodymyr Zelensky with his cabinet in military fatigues defiantly announcing, “We are all here defending our independence, our state, and it will remain so. Glory to Ukraine!” was central to this effort. Only 30 seconds long, the video was meant to project Ukrainian resolve in the face of Russian aggression. Every subsequent video of Ukrainian success shared on social media demonstrated that Ukrainians might, against all odds, be able to repel the initial assault, and Western leaders responded.

As the narrative of possible success proliferated, Western leaders signaled resolve and committed valuable war materiel to Ukraine’s defense. The US and Baltic states provided Javelins to the Ukrainians. As Ukrainian successes were communicated over social media, military aid increased with Next Generation Light Antitank Weapons from Luxembourg, Bayraktar TB-2 drones from Turkey, T-72 tanks from the Czech Republic, and, most significantly, High Mobility Artillery Rocket Systems from the United States. The latter vastly increased the Ukrainian army’s lethality and targeting range and enabled it to disrupt Russian lines of communication. From April to December 2022, this critical support played a significant role in Ukraine’s ability to halt Russia’s advances and surprised many specialists.

Ukraine’s success forced the Kremlin to change its initial theory of victory: quickly seizing Kyiv and conducting regime change. Instead, the Russian military adjusted its operational approach by consolidating territorial gains in the Donbas region. To be sure, regime replacement remains Russia’s long-term strategic objective. A war of attrition is the defeat mechanism to destroy, dislocate, degrade, and disorient the Ukrainian army in the Donbas region. To achieve this end state, Russia must disrupt Western support for Ukraine.

Decreasing public support in Europe and America suggests that Russia’s information warfare has challenged popular resolve to support Ukraine and given the Kremlin a strategic advantage. I argue that Russia’s ability to generate counterbalancing audience costs has disrupted Western support to Ukraine, and that declining popular support in Western Europe indicates Russia’s information warfare is achieving effects. The case study in Grossenhain, Germany, provides an illuminating example of these effects. To reduce Russia’s ability to use disinformation to generate counterbalancing audience costs, the West should respond quickly, continue investing in information warfare infrastructure, and educate the public about how to identify disinformation.

Information Warfare and Audience Cost

Political scientists have theorized about the effect of “audience cost” on the foreign policy decisions of leaders of liberal democracies. The logic behind this perspective is that democratic leaders have structural accountability for their decisions: elected legislatures, established bureaucracies, active interest groups, and the general electorate punish democratically elected leaders for perceived incompetence on foreign policy matters. Western leaders therefore face significant challenges from opposition groups as they try to communicate how increased investments will assist Ukraine and why it is a vital national security issue for each respective nation. This is especially challenging in a war of attrition, where gains are slow and the benefits of investment are not readily apparent. In many ways, audience costs are related to public opinion about the efficacy of a given policy. Public opinion in liberal democracies constrains the decisions of elected leaders, who hesitate to back down on foreign policy decisions to avoid perceptions of incompetence. Information warfare in the Russo-Ukraine War might therefore be understood as a contest for generating audience costs through perceptions that enable or disrupt efforts to provide material support to Ukraine.

Material support serves as a “sunk cost” signal that creates audience costs for decision makers. These costs are self-generating in that leaders make deliberate decisions to incur costs, thereby signaling commitment credibility. While these decisions are costly in terms of fiscal resources, they also impose a significant toll on political capital. Thus, Western leaders committed national treasure in 2022 to the defense of Ukraine, and backing down risks signaling perceptions of incompetence. Current Western leaders, therefore, are deeply committed to Ukraine’s defense.  The risk of incurring audience costs from perceived incompetence for backing down makes a policy reversal unlikely.  Russia’s disinformation campaign seeks to create a counterbalancing narrative that makes the audience cost of continuing support to Ukraine more painful than the audience cost of backing down.

Achieving Counterbalancing Effects: A German Case Study

German Chancellor Olaf Scholz leads a left-leaning coalition government composed largely of Social Democrats and the German Green Party, and his government was an early proponent of supporting Ukraine’s defense. In 2022, the German government provided €2 billion ($2.1 billion) in military support. This support increased to €5.4 billion ($5.8 billion) in 2023, and reports suggest that German support is projected to reach €8 billion ($8.6 billion) in 2024. Through these “sunk cost” commitments, the German government is clearly bound to supporting Ukraine, and the Scholz government is loath to back down for fear of suffering audience costs.

The Kremlin seeks to counterbalance this commitment through information warfare.  According to the German Ministry of Interior, Russian propaganda “spreads false claims to try to justify military invasion, hide civilian casualties, and cement its narrative of the anti-Russian West.”  The primary targets of the propaganda are the Putinverstehers (“those who understand Putin”), Kremlin sympathizers who often repeat Putin’s propaganda line.  This sentiment is found across the entire German political spectrum, but it is most prevalent in right-wing political parties.  The situation in Grossenhain, a small town in Saxony, provides an example of how Russia effectively targets its propaganda to influence Putinverstehers and generate counterbalancing audience costs.

To meet increased demand for Ukraine armaments, Rheinmetall, Germany’s largest arms manufacturer, decided in 2023 to open a factory in Grossenhain. This factory would improve the economic prospects of the entire community and enable Germany’s support to Ukraine. Still, after Rheinmetall announced the decision, the City Council sent a letter to Scholz asking him to block the move, and Alternative for Deutschland (AfD), an ultra-right-wing German political party, organized a rally protesting against Rheinmetall’s factory.  

Putinverstehers at the rally condemned further Ukrainian support with Russian talking points. AfD leaders circulated a petition at the rally stating the town rejected “a further economic-military use” of the labor force. This echoes Russian disinformation, spread on several social media platforms, claiming that Germany is mobilizing its military and preparing to enter the war in Ukraine.  A center-right member of the Christian Democratic Union in the Saxony legislature said, “It’s difficult to explain to people why we should support Ukraine.” This comment falls in line with Putin’s claim that the West is responsible for the war and that the Ukrainian people have “become hostage of the Kyiv regime and its Western masters.”  A Grossenhain city councilor amplified the false narrative of Russian victimhood, saying, “I can imagine that Putin is feeling squeezed because NATO is slipping closer and closer.” Another AfD member of Saxony’s state legislature said that Western support was “putting us all at risk” of Russian retaliation.  This comment demonstrates how Russia uses information about putting its nuclear arsenal on “high combat alert” and conducting “readiness drills” to scare the public, mobilizing opposition against providing more support to the Ukrainians.

The Grossenhain case demonstrates how Russia’s talking points infiltrate policy debates, whether wittingly or unwittingly, through Putinverstehers in opposition parties to create counterbalancing audience costs for Western leaders.  For the time being, the Scholz government appears committed to supporting Ukraine.  However, the opposition AfD party, citing Russian talking points, resists continuing support. This is significant because the AfD continues to improve its political position with the German electorate. In 2017, the AfD polled at 10 percent of the German electorate. Over the last year, the AfD’s poll numbers skyrocketed to 22 percent, making it the second most powerful political bloc in the country, and some predict the party will fare very well in the 2025 federal elections.  Thus, given its stated policy of opposing Ukrainian support, increased power for the AfD would be a favorable political outcome for the Kremlin.

What is to be done? 

Because of the West’s commitment to free speech, countering Russia’s propaganda is an immensely challenging task. Scholars in various security organizations and institutes have written much on how to thwart Russia’s propaganda effort, and they all more or less coalesce around the following policy recommendations:

  • Establish Defensive Infrastructure

    The 2022 US National Security Strategy describes Russia’s attempt to “sow divisions among the American people” and “countries across Europe, Central Asia, and around the world” as a destabilizing threat.  In view of this fact, the West must continue to invest in its defensive infrastructure. The creation of organizations such as the Czech Center Against Terrorism and Hybrid Threats, the U.S. Global Engagement Center, and NATO STRATCOM are examples of investments to confront the Russian disinformation threat.  Measuring the effectiveness of these organizations remains a challenge.  However, given the recognized pervasiveness of Russian information warfare, the need for their existence is clear for those who accept the significance and power of the information instrument of national power.

  • Speed

    The West must seek to achieve a first-mover advantage against the Russians. The US, for example, publicized Russia’s plans to invade Ukraine several months prior to the commencement of operations and undercut a potential false flag operation. There are intelligence loss/gain tradeoffs in this approach, but it can be effective for preempting a disinformation attack. The EU’s establishment of the Rapid Alert System, a system meant to quickly identify and respond to disinformation, is a good example of investment in defense against information attacks.  As Christopher Paul and Miriam Matthews have pointed out, early response to Russian information fabrications helps mitigate the effectiveness of the Kremlin’s disinformation activities.

  • Censorship

    Because of concerns that published information might help the enemy, World War Two leaders endorsed the practice of censorship to prevent information from being used to achieve advantage.  In a similar fashion, the EU banned broadcasts of RT and Sputnik inside Europe. This approach is difficult because it violates notions of free speech, and there are plenty of conduits for dissemination that Russia can exploit. The effectiveness of censorship, therefore, is questionable. Still, for acute problems of disinformation in gray zone activities (competition below the level of armed conflict), censorship should remain a viable option used with caution.  The intent of the media outlet is the key discriminator on whether to censor: there must be clear propagandistic intent.  RT and Sputnik are state organs of the Kremlin and are candidates for censorship.  This is different from a media outlet that freely reports on an issue and happens to comment favorably on a Russian talking point.  The lines are blurred, which is why caution is mandatory with this method.

  • Education

    This approach emphasizes two techniques. First, governments must educate citizens on how to identify disinformation. Finland, for example, makes a deliberate effort to teach students critical thinking skills. Second, simply propagating the truth is important. The EU launched a website, EUvsDisinfo, to counter Russian propaganda with the truth. These approaches have lag times in terms of effectiveness, but governments have rightly embraced them as a viable form of defense.

  • Public-Private Cooperation

    Given the distrust of established institutions, one of the most promising forms of information defense is through public-private cooperation. The Estonian Defense League (EDL), a group of private citizens committed to defending their country from Russian attack, works tirelessly to counter Russian propaganda. Groups like the EDL are perceived as independent from government control and highlight how resilient a society can be to disinformation when coalesced around a common vision.  However, the independent nature of their operations makes them difficult to control, which is a risk that must be considered.

Conclusion

The current leaders of Western democracies are committed to supporting Ukraine partly because they do not want to suffer audience costs. Russia views Western support as a critical capability for Ukraine’s defense. To achieve victory, Russia must disrupt, decrease, and, if possible, eliminate Western support. One tactic they pursue in this regard is the use of information warfare to generate counterbalancing audience costs. The preferred technique is to saturate the information environment with semi-plausible talking points that opposition parties use to gain political power, which increases pressure on already committed Western leaders. As noted in Grossenhain, Germany, Russia has achieved some success in generating counterbalancing audience costs.

Author

Jerry E. Landrum is a U.S. Army officer and faculty instructor at the U.S. Army War College. He spent most of his career serving as an information operations officer. He holds a PhD in Security Studies from Kansas State University where he studied U.S. European security policy at the end of the Cold War. The views and opinions presented in this article represent those of the author and do not represent the Army War College, the U.S. Army, the Department of Defense, or any part of the U.S. government.

Update from the US Army War College’s Influence and Information Advantage Elective https://information-professionals.org/update-from-the-us-army-war-colleges-influence-and-information-advantage-elective/ Fri, 26 Jul 2024 04:11:53 +0000

Update from the US Army War College’s Influence and Information Advantage Elective: Information Wargaming Lessons Learned

Authors: CDR Michael Posey, Mr. Joseph Wheaton, COL Jerry Landrum, PhD

In this article, we will discuss how we used three different influence wargames at the US Army War College to achieve learning objectives during an information-centric elective and share our reflections on the wargames and our own performance.

Military professionals understand the complexity and importance of influence. From the early stages of our military careers, we are exposed to leaders who shape our units and witness the effects of our military actions in the information environment. National security professionals intuitively grasp the “understand” and “leverage” functions of information described in Joint Publication 3-04 Information in Joint Operations.  These concepts are central to clarifying our own decision-making and altering the decision-making of our adversaries.  The U.S. Army War College extensively studies, discusses, and thinks strategically about influence. However, applying influence, especially offensively, is a complex and challenging task against a thinking adversary. Information advantage is a crucial factor that generates operational military advantage, especially as it relates to tempo and creating dilemmas for adversaries. It can be achieved when an adversary’s decision-making process is disrupted, usurped, corrupted, or influenced. Information advantage provides windows of opportunity for military commanders to achieve operational effects against their opponents, but like any military action, it requires practice and strategic thinking. 

Teaching national security professionals about information advantages is a daunting task. However, wargaming provides a low-risk, immersive, and participatory experience that enhances teaching cognitive influence techniques, inspires strategic thinking, and motivates students to understand the nuances of influence activities. Wargaming is not just fun but a tangible and dynamic way to practice decision-making and develop critical thinking skills. We firmly believe that wargaming can push influence lesson objectives higher up the triangle in Bloom’s Taxonomy (see Figure 1), a hierarchical educational classification learning model. In simpler terms, with wargaming, students can put what they have learned into action, not just understand it, thereby enhancing their retention of influence knowledge. In our Army War College elective, Influence and Information Advantage, we harnessed the power of wargaming to elevate our learning outcomes on Bloom’s Taxonomy for our thirteen students.

After teaching our information elective in 2023, we wrote for IPA about employing a commercial off-the-shelf (COTS) wargame, War of Whispers. As our Vice Chairman of the Joint Chiefs of Staff, Admiral Christopher Grady, notes, wargaming allows for constantly improving concepts. We firmly believe in the importance of wargaming in educating the Joint Force, especially with an inherently human topic like influence. In 2024, we again leveraged wargames, which we see as a natural pairing with influence. After covering foundational readings on the history and philosophy of information, we focused academic inquiry on the technical aspects of narrative, Military Deception (MILDEC) doctrine, and case studies like Fortitude and Desert Storm.  We also assigned readings about influence wargaming, such as the United Kingdom Ministry of Defence Influence and Wargaming Handbook.  Finally, we introduced new material from our colleague Allison Abbe’s brilliant piece on the importance of strategic empathy and perspective-taking.  We experimented with three information-related wargames to elevate learning from these assignments: War of Whispers (for a second time), NATO Mission: Debunk Twister, and Malign.

Revisiting War of Whispers

We played War of Whispers three times during our ten-class elective. We wanted to familiarize players with the rules, so we introduced the game as an icebreaker on the first day. Halfway through the elective, we played a second time, following the persuasion and MILDEC lessons.  Beforehand, we reviewed Cialdini’s seven principles of influence and MILDEC basics. We encouraged table talk and negotiating during the second game. We thought this timing and the quick review might be appropriate because players employ agents and use deception, concealment, influence, information, and misinformation to gain power over kingdoms.  Finally, we played War of Whispers a third time on our last day of class to see if the students could perform better after more practice with the game. For the winner, the player who best manipulates and orchestrates the rise and fall of the empires to that player’s secret advantage, we offered only the incentive of “bragging rights.”

In playing War of Whispers, our lesson objectives were to:

  1. Apply OPSEC, MILDEC, and influence basics to gain an advantage over your classmates in a practical experiential learning exercise.
  2. Understand the limitations of influence theory in a resource and time-constrained practical, experiential learning exercise.

While we definitively achieved lesson objective 2, we struggled to meet lesson objective 1. Although some students fully understood the game rules, many never fully comprehended them without a facilitator’s help. Our biggest mistake was not playing the game twice in a row during our three-hour sessions, as we did last year; instead, we played it once during each elective session after a seminar discussion. As we expected, during the first round of War of Whispers, the students focused on understanding the rules and mechanics of the game instead of applying the information lessons and concepts. Unfortunately, the students never became comfortable with the game mechanics.

To improve for AY25, we have three courses of action. First, we could record videos of sample War of Whispers gameplay, complete with table talk and examples of how to use influence and MILDEC/OPSEC within the game. Second, we could keep the game as an icebreaker but spend more time on the two other wargames we tried during our elective. Finally, and most likely, we will drop War of Whispers entirely to dedicate more class time to the two other games we experimented with during our elective.

Altering and Applying NATO Mission: Debunk Twister

Developed by Jānus Rungulis, Madara Šrādere, and Vitālijs Rakstiņš of NATO STRATCOM, NATO Mission: Debunk Twister is a card game that uses current real-world myths to guide gameplay.  The game designers advertise Debunk Twister as a tool to teach information resilience and media literacy through a “narrative battle” surrounding 14 myths about NATO-Russian relations. We thought this game would be useful to our graduates since there is no shortage of myths that Russia spreads about NATO, and NATO puts considerable effort into debunking Russian falsehoods.  These myths touch on well-known and contentious topics such as NATO expansion (past and future), NATO encirclement of Russia, and NATO’s purported post-Cold War purpose.  

 The game is straightforward to play and consists of each team receiving 30 cards with pictures of simulated social media posts.  After spending at least 10 minutes carefully examining the cards, the students begin playing the cards against each other, with the Red Team (Russia) moving first.  The Red Team is free to propagate any narrative they choose, and the Blue Team (NATO) must respond to the Red Team narrative.  After all the cards (see Figure 2) are played against each other, the instructors determine which team best executed their narrative strategy.    

Our lesson objectives while playing Debunk Twister were:

  1. Understand how narratives are constructed and form the basis for themes, messages, and the military application of information from both theoretical and practical perspectives.
  2. Understand Russia’s use of informational power prior to and during its 2022 invasion of Ukraine.
  3. Analyze NATO’s use of informational power in response to Russian activities.
  4. Apply narrative theory basics in practice using a NATO vs. Russia wargame.

Our assessment is that we only had moderate success in achieving the first learning outcome.  Only allowing the students 10 minutes to construct a coherent narrative is challenging.  More time allocation for creating a narrative strategy would be helpful.  For the second learning outcome, we assess a high level of success.  At the end of the wargame, all the students understood the myths surrounding the narrative battle between Russia and NATO.  The game only identified 14 myths, but the students inevitably extended the game boundaries and incorporated myths emerging from the ongoing Russo-Ukraine War.  The third outcome was also achieved in that the students analyzed how NATO might use social media to rapidly respond to Russian disinformation, primarily how this medium might be used to amplify pro-NATO narratives.  Finally, for the last learning outcome, the game successfully prompted the students to incorporate previous classroom discussions and material, which deepened learning across the board.

The simplicity of this game is its most significant advantage.  Most students can learn the simple rules in about five to ten minutes.  Also, the game’s simplicity allows moderators to modify the rules to best achieve the desired learning outcomes.  For example, we are experimenting with an idea to allow students more time to design an information campaign based on the USAWC curriculum related to strategy formulation, Cialdini’s principles of influence, and other related techniques to develop an information campaign to support an operational approach.  Allowing students additional time to create an information campaign would facilitate more coherent gameplay as each card play would theoretically relate to some information campaign objective.  We are also examining creative ways to distribute the cards to the team to drive gameplay better and allow moderators to inject unexpected events to see how each team adjusts its narrative strategy.  In summary, the NATO narrative game is easy to play and modify and provides an excellent opportunity to achieve desired learning outcomes comprehensively.

Finishing Strong with the Nation-State Game of Influence: Malign

Malign is an innovative card-driven educational game that explores the intricate world of malign influence, such as misinformation and disinformation (see Figure 3). Noted wargamer Sebastian Bae created the game with Emily Yoder, Grace Hwang, and Jared Cooper to teach influence in the information age. Students represent different fictional countries and engage in the competition continuum below the level of armed conflict. Students aim to increase their malign influence on other nations while simultaneously defending against influence attempts on their own populations by building resiliency through institutional strengthening and education campaigns.

Students build influence campaigns by combining cards representing the campaign’s intent, method, and amplifier. They then activate these campaigns using creative storytelling, as students must justify their influence campaigns with realistic narratives. The game’s “effects result table” determines whether they successfully add malign influence, social resilience, or, with an unlucky die roll, campaign backlash, which produces the opposite of the intended effect on the targeted population demographics. The game promotes a tense, engaging back-and-forth influence struggle. Through this immersive gameplay experience, Malign aims to educate students on the evolving threats of coordinated disinformation efforts while encouraging critical thinking about how societies can foster resilience against such tactics.

During this game, we sought to move the lesson objectives further up Bloom’s Taxonomy so students would retain and understand the lessons better than before. We designed the lesson objectives to equip students with the knowledge and skills necessary to navigate the complexities of the modern information environment. Our lesson objectives were:

  1. Understand the role of information power in supporting integrated deterrence.
  2. Comprehend narrative construction and how it applies to information advantage.
  3. Apply influence in simulated real-world scenarios with limited resources and time constraints.  

Through this set of objectives, we wanted students to develop a nuanced understanding of the role of information in strategic-level integrated deterrence, narrative construction, and influence theory.

Using Malign to achieve our lesson objectives above, we quickly discovered that it is a well-designed and thoughtful game that creates an immersive and engaging learning experience for students. We played the game over two class sessions within one week of each other, with the first as a chance for the students to understand the game and the second for them to apply influence techniques. While students initially struggled to apply what they had learned over the course, once we laid the foundation by ensuring they understood the game mechanics, we began to see learning occur. Toward the end of the second playthrough, students were applying and thinking through all the lesson objectives, especially the second and third. The game’s only drawback was that our students did not pick up the rules as quickly as we hoped. However, they did grasp the rules faster than those of War of Whispers, the COTS game we played. As such, we assess that we achieved all three lesson objectives for Malign.

The game accomplishes the first objective of employing information power through integrated deterrence by allowing students to construct and execute influence campaigns using different components such as intent, method, and amplifier cards. Students must use information power, represented by these cards, to achieve their objectives and deter adversaries. Malign’s game mechanics encourage students to consider the relationship between military and non-military tools and the coordination of various information operations and strategic communication efforts.

The “narrative rule” in the game directly addresses our narrative-construction lesson objective. Students struggled to articulate a campaign through a concise narrative during our first playthrough because they were still learning the game. After the first game, however, students began providing brief narrative explanations for their influence campaigns, which pushed them to develop compelling stories and themes aligned with their strategic goals. This aspect of the game helps students understand the importance of crafting realistic narratives and connecting with target audiences through strategic messaging. At a fundamental level, Malign helped students grasp the importance of linking narratives to military planning and campaigns.

Finally, the game’s resource management system, which requires students to spend limited resources to activate campaigns, simulates the constraints faced in real-world scenarios. Malign’s resource allocation system served us well in achieving our third objective, ensuring students could apply their influence techniques in a resource-constrained environment. As the Government Accountability Office notes, modernizing operations in the information environment with finite resources remains a challenge for the DoD. Additionally, the turn-based structure and time limits imposed by the game create a sense of urgency and pressure, mimicking the time constraints often encountered in practical situations. By experiencing these limitations firsthand, albeit in an abstract manner, students can gain some understanding of the challenges of implementing influence strategies with limited resources and time.

Balance of Gameplay vs Student Learning: We May Alter Malign

During the AY25 electives period, we will continue to emphasize Malign in our gameplay, and we may make three minor adjustments. First, to enhance understanding of narrative construction, we may incorporate more detailed feedback mechanisms or peer-review processes for the narratives students develop. We believe doing so could encourage deeper analysis and refinement of storytelling techniques. Second, to better simulate resource and time constraints, we may incorporate more dynamic and unpredictable elements, such as random events or unexpected challenges that disrupt player plans and force them to adapt their influence strategies on the fly. Finally, we may sharpen the friction between order and freedom in information control, since students must constantly weigh the ethical, legal, and practical implications of controlling information. The game does not address this topic, so we may add a small “event deck” tied to specific scenarios that introduce ethical dilemmas or legal considerations related to government control. This change could prompt students to grapple with these issues more directly. More importantly, it would allow for a rich post-game discussion of the tensions governments face in managing misinformation and disinformation while balancing public safety, national security, and freedom of speech.

Our recommendations might help achieve each lesson objective. Still, we must balance implementing them against the cost in time (perhaps doubling it) and the risk of disrupting the ease of gameplay, which we considered one of Malign’s strengths. The game does an excellent job of touching on each lesson objective and sparking further discussion. In sum, Malign offers a comprehensive and engaging approach to teaching the complexities of misinformation and disinformation, influence campaigns, resilience, and strategic deterrence.

Finally, regardless of any alterations we may make, when we employ Malign during the next academic year, we will strive to connect its fictional scenario with the real-world military and political activities the game mirrors. We will do so by assigning a couple of required readings for students to review beforehand and then setting aside time to discuss those points during our Malign after-action review.

In sum, we play-tested three wargames this year. Wargames are a dynamic and immersive narrative tool for educating students in the application of influence. Although War of Whispers is a lot of fun, we had a harder time meeting our lesson objectives with it this year. So, next year we will invest our time in the NATO Debunk Twister (primarily because of how easy it is to play) and in Malign. Malign is the best educational wargaming platform for Influence and Information Advantage because of its comprehensive, immersive play. (It’s fun!) Yet Malign is also relatively simple to play and sparks great discussion about nation-states and malign influence in the information age.

About the Authors

CDR Michael Posey is an active-duty Navy officer. Mr. Joseph Wheaton is an Army civilian.  COL Jerry Landrum, PhD is an active-duty Army officer. All three currently teach in the Department of Military Strategy, Planning, and Operations at the School of Strategic Landpower at the U.S. Army War College in Carlisle, PA.

Disclaimer

The views expressed are those of the authors and do not reflect the official position of the U.S. Army War College, Department of the Army, Department of the Navy, Department of the Air Force, Department of Defense, or U.S. government.

The post Update from the US Army War College’s Influence and Information Advantage Elective appeared first on Information Professionals Association.

Cognitive Terrain Mapping https://information-professionals.org/cognitive-terrain-mapping/ Fri, 26 Jul 2024 03:53:52 +0000 https://information-professionals.org/?p=16034 Cognitive Terrain Mapping: Charting a way forward in Information Operations Authors: John Bicknell and Christian Andros This article describes Cognitive Terrain Mapping. Currently being prototyped by the US Army, it […]

The post Cognitive Terrain Mapping appeared first on Information Professionals Association.

Cognitive Terrain Mapping: Charting a way forward in Information Operations

Authors: John Bicknell and Christian Andros

This article describes Cognitive Terrain Mapping. Currently being prototyped by the US Army, it is a versatile new capability which employs complexity science and information theory to visualize the changing cognitive states of humans within systems. The Army is interested in using the capability within its information advantage (IA) portfolio. After a brief introduction, Cognitive Terrain Mapping is explained, and a use case with benefits is presented.

[Note: Readers interested in related national security applications which synthesize complexity science, operations research principles, and Information Theory should also read The Coin of the Realm: Understanding and Predicting Relative System Behavior and Cognitive Arbitrage: Complexity, Variety and Human Cognitive States Are Related.]

Introduction

The past two years have been a watershed period for Information Operations (IO) doctrine. The Joint Staff published Joint Publication (JP) 3-04, and the military services promulgated new publications that expand on integrating Information as the 7th Warfighting Function, as discussed by LtGen Jerry Glavy and COL John Agnello on the Cognitive Crucible podcast.

While each publication has different terminology and applies information with varied techniques, tactics, and procedures, each uses information to gain tangible and reinforcing advantages. JP 3-04, for example, has an entire Guide for the Integration of Information in Joint Operations (Appendix C) in order to gain an advantage. Likewise, in Army Doctrine Publication (ADP) 3-13, one such advantage is the “likely [emphasis added] psychological impact of their operations and tasks on relevant actor perceptions, attitudes, and other drivers of behavior. The inherent informational aspects of operations produce cognitive effects on threats and other foreign relevant actors, including fear, anger, or confidence.”

But how can warfighters practically integrate new informational capabilities and evaluate “likely psychological impacts”? This is a difficult proposition, since military units often lack the appropriate skills and abilities to conduct detailed assessments, especially at scale. Additionally, given current methods, seasoned psychological operations practitioners and researchers question whether it is even possible to understand causal relationships between operations in the information environment (OIE) and desired effects. For these reasons, warfighters need new tools and techniques to better understand and engage with the information dimension. These tools must do more than simply observe the sheer volume of data or messages; they must identify key influencers and susceptible (or resistant) audiences where IO could have disproportionate effects. Cognitive Terrain Mapping is one such approach.

Cognitive Terrain Mapping

Measures of complexity–or variety–provide a lens into the cognitive state of humans who manage or participate within systems of interest. Yaneer Bar-Yam, a Cognitive Crucible podcast guest and noteworthy complexity scientist, asserts: “Information, complexity, and entropy are really kind of the same thing looking at them from slightly different perspectives.” During the same podcast episode, Prof. Bar-Yam mentions Ashby’s Law of Requisite Variety, which states that all systems must be able to respond to the variety of environmental stimuli in order to survive and continue goal pursuit. Stimuli “may be actively hostile, as are those coming from an enemy, or merely irregular, as are those coming from the weather,” according to Ashby. But what happens to the human brain during moments of excessive environmental variety or complexity?

Claude Shannon pioneered Information Theory in the 1940s. Information Theory is a mathematical representation of how efficiently and clearly information can be transmitted. Current researchers observe that information transmission is affected by factors exogenous to the medium itself–such as noise–which prevent receivers from fully understanding the original message. Moments of information overload and system noise confound human ability to orient, prioritize, decide, and then act. Such system phenomena (as individual humans and groups make up larger complex, adaptive systems) may increase cognitive vulnerabilities at certain times. Careful analyses identify exploitable influence timing opportunities based upon these cognitive vulnerabilities that result in Information Advantages.
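
To make the link between variety and Shannon’s framework concrete, a complexity measure can be computed directly as the entropy of an observed event distribution. The sketch below is our own illustration, not the authors’ implementation: higher entropy means more environmental variety for a decision-maker to absorb.

```python
from collections import Counter
from math import log2

def shannon_entropy(events):
    """Entropy (in bits) of the empirical distribution of event types."""
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Illustrative event streams: low variety vs. high variety.
calm = ["patrol"] * 9 + ["protest"]
volatile = ["patrol", "protest", "strike", "outage", "riot"] * 2

print(round(shannon_entropy(calm), 3))      # ≈ 0.469 bits
print(round(shannon_entropy(volatile), 3))  # log2(5) ≈ 2.322 bits
```

Under Ashby’s framing, the second stream demands far more “requisite variety” from whoever manages the system, which is the intuition Cognitive Terrain Mapping builds on.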

Cognitive Terrain Mapping estimates and visualizes these changing cognitive states of target audiences. It is an IA-enabling capability that applies especially during the competition continuum phase of operations. A Cognitive Terrain Mapping dashboard uses complexity science and information theory to visualize and capitalize on opportunistic moments within complex systems based upon changes in human cognitive states. Cognitive Terrain Maps are also domain-agnostic, data-agnostic, and 100% explainable. Cognitive Terrain Maps do not have to be geographic; any data from which human events may be derived are usable. Moreover, different exploitable cognitive vulnerabilities (or states) are revealed when Cognitive Terrain Mapping is applied to different systems. For example:

  • Space: Satellite orbital and maritime tracking data may be used to identify maneuvers from which [cognitive] planning and execution cycles may be inferred and future maneuvers predicted.
  • Cyber: Cyber event logs identify varieties of attacks which may be used to make inferences about enemy cognitive states and BLUE/RED system vulnerabilities.
  • Ad Tech 1: Bidstream data may be used to measure the complexity of mobile device usage which may be used to make inferences about the average cognitive state of target audiences.
  • Ad Tech 2: Similarly, varieties of emotional states contained within social media posts may be converted into a target audience complexity measure, which may be used to engage appropriately and advantageously.
  • Business: Event logs extracted from enterprise management systems contain activities which represent the cognitive states of managers and employees; these data may be used for multiple purposes such as performance coaching, organizational restructuring, and short-term teaming projects.

Use Case and Benefits

The US Army is currently prototyping a Cognitive Terrain Mapping capability which uses Global Database for Events, Language, and Tone (GDELT) data. GDELT provides a wealth of event data from worldwide news sources. Cognitive Terrain Maps fueled by GDELT data visualize the average cognitive states of system managers or participants based on global news reporting related to countries, persons of interest, multinational organizations, and even news outlets. Varieties of cooperative and conflicting events recorded within GDELT are converted into complexity measures which enable useful cognitive terrain perspectives:

  1. Location-specific maps measure and compare the average cognitive states of system managers or system participants. A system manager may be a national or local government official (for example: Secretary of Agriculture, Minister of Transportation, or Tribal Elder) who has an ongoing goal pursuit agenda. System participants include local populations which are experiencing the changing system dynamics.
  2. Foreign actor energy expenditure maps depict how, where, and when foreign countries have operated globally or regionally. Larger varieties of events involving a foreign actor imply more management persistence and effort. Cognitive heatmap movies tell a long-term, generational story of foreign influence.
  3. News outlet cognitive maps compare global, regional, or local editorial policies. Over time, changes in editorial policy may indicate changes in news outlet ownership or political power. New target audiences may be defined by clustering outlets with similar bias or news coverage, as evidenced by similar cognitive trends.
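
As a rough illustration of how varieties of events might be converted into such a complexity measure, the sketch below computes a normalized entropy per actor per time window. The records, field layout, and event codes are fabricated stand-ins, not GDELT’s actual schema or the Army prototype’s method.

```python
from collections import Counter
from math import log2

def normalized_entropy(event_codes):
    """Entropy of the event-type distribution, scaled to [0, 1]."""
    counts = Counter(event_codes)
    n = sum(counts.values())
    k = len(counts)
    if k <= 1:
        return 0.0  # a single event type implies no variety
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    return h / log2(k)

# Toy (week, actor, event-code) records standing in for GDELT rows;
# the codes and values are invented for illustration.
records = [
    (1, "ACTOR_A", "042"), (1, "ACTOR_A", "042"), (1, "ACTOR_A", "051"),
    (2, "ACTOR_A", "042"), (2, "ACTOR_A", "112"), (2, "ACTOR_A", "145"),
]

windows = {}
for week, actor, code in records:
    windows.setdefault((actor, week), []).append(code)

# A rising trend suggests the actor is managing a wider variety of events.
for (actor, week), codes in sorted(windows.items()):
    print(actor, week, round(normalized_entropy(codes), 2))
```

Plotting such per-window values over time and geography is, in spirit, what turns raw event feeds into a cognitive terrain map.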

Cognitive Terrain Mapping provides a wide array of decision support to IO planners, analysts, and operators. It can highlight moments of system vulnerability (and also resilience) to direct, prioritize, and synchronize influence activities. These measures can directly enable IA by providing metrics to determine the “relative cognitive state” of a target audience across a number of relevant subject areas. These powerful measures enable planners to capitalize on system disparities or imbalances to maximize effects, focus effort, and create relative advantage. Additionally, these disparities and imbalances can be tracked over time.

Cognitive Terrain Mapping also creates quantitative metrics and other inputs to help commanders assess dynamic threats, opportunities, and vulnerabilities within the IE. Such inputs help form a Common Operational Picture (COP) to display relevant information within a commander’s area of interest tailored to the user’s requirements and based on common data and information shared by more than one command. The COP helps commanders at all echelons achieve shared situational understanding to enable planning, preparation, execution, and assessments.

These benefits gain greater efficiencies and promote convergence, an outcome created by the concerted employment of capabilities from multiple domains and echelons against combinations of decisive points in any domain to create effects against a system, formation, decision maker, or in a specific geographic area. Its utility derives from understanding the interdependent relationships among capabilities from different domains and combining those capabilities in surprising, effective tactics that accrue advantages over time.

Conclusion

By employing an innovative technology like Cognitive Terrain Mapping, modern warfighters can gain greater situational awareness, maximize and synchronize efforts, and reduce risk. This employment will highlight adversary activity, target vulnerabilities and create desired effects–all of which help the commander to gain IA.

About the Authors

John Bicknell is the CEO and Founder of More Cowbell Unlimited. A national security thought leader and passionate analytics visionary, he has written extensively on national security matters related to information warfare, critical infrastructure defense, and space situational awareness. Before retiring from the United States Marine Corps in 2010 as a Lieutenant Colonel, John served worldwide, most notably in Afghanistan and at the Pentagon. He led enterprise-level process intensive human resources supply chain projects designed to discover inefficiencies, architect solutions, and re-purpose manpower savings. In his corporate career, he operationalized an Analytics Center of Excellence for a large EdTech firm, among other accomplishments. John is Vice President for the Information Professionals Association and host of The Cognitive Crucible podcast. His Master’s degree from the Naval Postgraduate School emphasizes econometrics and operations research.

Christian Andros recently retired from the Department of the Navy after 32 years of service. An intelligence officer, he has a background in Kinetic and Non-Kinetic Targeting, Information Operations and Intelligence Analysis. His most recent position was Director of Intelligence at the Marine Corps Information Operations Center (MCIOC). He now serves as an instructor for the Information Environment Advanced Analysis Course.

 

Countering Cognitive Warfare in the Digital Age https://information-professionals.org/countering-cognitive-warfare-in-the-digital-age/ https://information-professionals.org/countering-cognitive-warfare-in-the-digital-age/#comments Thu, 16 May 2024 05:00:32 +0000 https://information-professionals.org/?p=15636 Countering Cognitive Warfare in the Digital Age: A Comprehensive Strategy for Safeguarding Democracy against Disinformation Campaigns on the TikTok Social Media Platform Authors: Shane Morris, David Gurzick, Ph.D., Sean Guillory, […]

The post Countering Cognitive Warfare in the Digital Age appeared first on Information Professionals Association.

Countering Cognitive Warfare in the Digital Age: A Comprehensive Strategy for Safeguarding Democracy against Disinformation Campaigns on the TikTok Social Media Platform

Authors:

Shane Morris, David Gurzick, Ph.D., Sean Guillory, Ph.D., Glenn Borsky

Abstract

In the contemporary digital age, the battle for cognitive supremacy has extended beyond traditional arenas to the ubiquitous domain of social media platforms. Among these platforms, TikTok has emerged as an unexpected yet potent vector for state-sponsored disinformation campaigns. This study scrutinizes the deployment of Large Language Models (LLMs) by the Russian intelligence entity, the GRU, to propagate misinformation aimed at destabilizing Western democratic norms and undermining the geopolitical standing of the United States and its allies. Through a methodical approach involving web scraping tools and the strategic use of change data capture technologies, coupled with the deployment of Retrieval Augmented Generation (RAG) models, the GRU has executed a campaign of unprecedented sophistication. This campaign is not merely an attempt to misinform the public but a calculated strategy to erode their trust in essential institutions, from government and media to the electoral process, thereby fracturing societal cohesion.

The urgency of addressing this threat cannot be overstated. The GRU’s tactics signify a shift towards cognitive warfare, exploiting the viral nature of social media to achieve a scale of psychological impact previously unattainable with traditional propaganda methods. The use of LLMs allows for the generation of contextually relevant, persuasive, and tailored disinformation at a pace that outstrips the ability of human moderators and existing automated systems to effectively counteract. This not only amplifies the potential reach and impact of disinformation but also significantly complicates the detection and mitigation of such campaigns.

Moreover, the GRU’s focus on TikTok, a platform with a vast and predominantly young user base1, highlights a strategic investment in long-term cognitive influence. The platform’s algorithmic predispositions, notably the “Monolith” advertising algorithm, are exploited to ensure the wide dissemination of disinformation, leveraging the inherent weaknesses in TikTok’s design as a media network lacking robust community and messaging tools. While TikTok has the largest user base among platforms with this system design, the GRU could exploit similarly designed social platforms with comparable weaknesses.

The implications of inaction are profound. The GRU’s operational success on TikTok provides a blueprint for other adversarial actors, threatening not only the integrity of democratic discourse but also the security of the industrial base essential for national defense and the support for allies like Ukraine. The window for effective countermeasures is narrowing as the techniques and technologies employed become more refined and embedded within the digital ecosystem.

This paper contends that the immediate development of counterstrategies, including the creation of an open-source dashboard by the US Department of Defense (DoD), is imperative. Such initiatives would enable transparent research into both operations and misinformation campaigns, facilitating the identification and removal of inauthentic accounts. By enhancing public awareness and providing tools for the civilian sector to recognize and resist misinformation, democratic societies can begin to fortify themselves against the insidious threat of cognitive warfare. The urgency of this endeavor cannot be overstated; the defense of the informational commons is a critical front for preserving democracy and national security in the digital age.
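
A dashboard of the kind proposed here would need simple, explainable heuristics for surfacing inauthentic accounts. The sketch below is purely illustrative (it is not an existing DoD or platform tool, and the thresholds are invented): it flags accounts whose comment rate or near-duplicate comment ratio exceeds plausible human behavior.

```python
from difflib import SequenceMatcher

# Illustrative thresholds -- not derived from any real detection system.
MAX_HUMAN_COMMENTS_PER_HOUR = 30
MAX_DUPLICATE_RATIO = 0.5

def near_duplicate_ratio(comments, similarity=0.9):
    """Fraction of comment pairs whose text is near-identical."""
    pairs = [(a, b) for i, a in enumerate(comments) for b in comments[i + 1:]]
    if not pairs:
        return 0.0
    dupes = sum(SequenceMatcher(None, a, b).ratio() >= similarity
                for a, b in pairs)
    return dupes / len(pairs)

def looks_inauthentic(comments, hours_active):
    """Flag accounts posting faster or more repetitively than humans do."""
    rate = len(comments) / max(hours_active, 1e-9)
    return (rate > MAX_HUMAN_COMMENTS_PER_HOUR
            or near_duplicate_ratio(comments) > MAX_DUPLICATE_RATIO)

bot_like = ["Support the cause now!"] * 6
print(looks_inauthentic(bot_like, hours_active=0.1))    # posts 60/hr, all dupes
print(looks_inauthentic(["nice video", "lol"], hours_active=5))
```

Real detection would weigh many more signals (account age, network structure, posting cadence), but even toy heuristics like these show why transparency into account-level metadata is a precondition for the research this paper advocates.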

Introduction 

In the current digital era, the delineation between social media platforms and traditional media outlets (e.g., broadcast television, radio, print) has become increasingly blurred. TikTok, ostensibly a social media network, has transcended its initial branding to emerge as a formidable rival to broadcast and streaming media in terms of user engagement and consumption time – a growing proportion of U.S. adults now regularly get news from the platform2. This evolution is not merely a testament to the platform’s viral appeal but signals a cultural shift in how the public consumes, processes, and values information. This paper endeavors to dissect the implications of TikTok’s metamorphosis from a social network to a dominant media entity, particularly in the context of its exploitation by the Russian intelligence agency, the GRU, for the dissemination of misinformation.

TikTok’s ascent to a media powerhouse is characterized by its unprecedented user engagement metrics. Unlike traditional social networks that primarily facilitate interpersonal communications and content sharing among user-defined networks, TikTok relies on a sophisticated algorithmic engine — notably, its “Monolith” advertising algorithm — to curate and push content to users based on inferred preferences. (Boeker and Urman, 2022) This model of content delivery, detached from the confines of social connections, enables TikTok to function more as a broadcaster than a platform for social interaction. The result is a user experience that is highly addictive and engrossing, with individuals spending significant portions of their daily screen time immersed in TikTok’s endless content streams (Ionescu and Licu, 2023).
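
The contrast between engagement-driven curation and a social-graph feed can be caricatured in a few lines. Monolith itself is proprietary, so the fields and scores below are invented purely for illustration.

```python
# Caricature of engagement-first feed ranking. TikTok's actual Monolith
# system is proprietary; these candidate videos and scores are invented.
videos = [
    {"id": "v1", "predicted_engagement": 0.91, "from_followed_account": False},
    {"id": "v2", "predicted_engagement": 0.40, "from_followed_account": True},
    {"id": "v3", "predicted_engagement": 0.75, "from_followed_account": False},
]

# Unlike a friends-first social feed, ranking ignores the social graph
# entirely and orders content by predicted engagement alone.
feed = sorted(videos, key=lambda v: v["predicted_engagement"], reverse=True)
print([v["id"] for v in feed])  # ['v1', 'v3', 'v2']
```

The point of the caricature is that a followed account’s post (v2) can rank last: content reach is decided by the engagement model, not by who follows whom, which is what makes the platform behave like a broadcaster.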

The implications of this shift are profound. As TikTok becomes a primary source of information for vast segments of the population, especially among younger demographics, its influence over public discourse and opinion formation rivals that of traditional media outlets. This evolution has not gone unnoticed by state actors seeking to manipulate public perception and sow discord. The GRU’s operational focus on TikTok is emblematic of a strategic recalibration towards platforms that command substantial user engagement and can amplify disinformation at scale. By embedding misinformation within the platform’s content ecosystem, the GRU is leveraging both the addictive consumption patterns of TikTok users and the content-pushing algorithm of the platform to disseminate narratives designed to undermine trust in democratic institutions, manipulate public sentiment on geopolitical issues, and erode societal cohesion.

The significance of TikTok’s role in the contemporary media landscape cannot be overstated. As a platform, it represents the forefront of a new wave of digital consumption, where algorithmic curation supersedes social networking as the primary driver of content engagement. This shift poses unique challenges for countering misinformation, as the mechanisms of dissemination are deeply intertwined with the platform’s core functionality. Understanding TikTok’s transformation from a social media brand to a de facto media entity is crucial for developing effective strategies to mitigate the impact of state-sponsored disinformation campaigns. The GRU’s exploitation of TikTok underscores the urgency of addressing these challenges, as the stakes encompass not only the integrity of democratic discourse but the very fabric of societal trust and cohesion.

Methodology

The methodology employed by the GRU in leveraging TikTok for misinformation campaigns involves a complex integration of technologies designed to automate and scale the dissemination of disinformation. At the heart of this operation are bot or “sock puppet” accounts, which are automated entities that mimic real users’ activities. These bots are not rudimentary scripts but are powered by advanced Large Language Models (LLMs) equipped with LangChains and Retrieval Augmented Generation (RAG) frameworks. This section elaborates on the operational mechanics of these bots, highlighting the strategic use of comment responses as prompts and the resultant unprecedented volume of comments that far exceeds human capabilities.

Operational Mechanics of Bot Accounts

Bot accounts on TikTok are programmed to identify posts and creators with significant followings or engagement levels, ensuring that inserted comments reach a wide audience. The operation begins with web scraping tools like Beautiful Soup and Selenium, which are employed to harvest HTML and XML data from TikTok’s web version. This data includes engagement and content relevance metrics, enabling bots to target their efforts effectively.

Once potential targets are identified, the bots utilize change data capture services, such as AWS Athena, to monitor for new comments and engagements in real time. This monitoring allows the bots to insert comments that are contextually relevant and timed to maximize visibility and impact.

Integration of LLMs with LangChains and RAGs

The bots’ capability to generate persuasive and contextually appropriate comments is powered by an integration of LLMs with LangChains and RAGs. LangChains allows for the sequential processing of language tasks, enabling bots to understand the context of a post or a thread of comments before generating a response. This understanding is critical for creating comments that are not only relevant but also deeply engaging and likely to entice further interaction (Gurzick, 2009, Boot et al., 2009). While text-centric comments can effectively stimulate reflection and response (Baker et al., 2009), the emerging AI advancements in video production can similarly be strategically designed to enhance authenticity (Gurzick et al., 2009a).

RAGs further augment this capability by combining information retrieval with LLMs’ generative prowess. In this framework, when a comment or post is identified as a target, it serves as a prompt for the RAG model. The “retrieval” component of RAG searches a vast dataset of disinformation narratives, argumentative constructs, and factually incorrect information. Then, the “generation” of comments is customized based on this retrieved data, ensuring that the misinformation is not only adjusted to the current dialogue but also discreetly interwoven within seemingly genuine discussion. This method of tailoring and strategically positioning content, informed by context and intended impacts, has shown to be markedly effective in promoting desired behavior. (Gurzick et al., 2009b) What previously required significant effort and time has now been automated and condensed into just milliseconds.

Unprecedented Volume and Impact

The combination of these technologies enables the GRU’s bot accounts to operate at a scale and with a level of sophistication that far surpasses human capabilities. Unlike human operators, who are limited by physical constraints and cognitive processing speeds, bots can generate thousands of comments across multiple posts and conversations simultaneously. This unprecedented volume of comments ensures that disinformation can infiltrate a wide array of discussions, significantly increasing the likelihood of its acceptance and propagation among real users. 

Furthermore, the strategic use of comment responses as prompts allows for a dynamic and adaptive approach to disinformation. Each interaction provides new data that can be used to refine and target subsequent comments, creating a feedback loop that continually enhances the efficacy of the misinformation campaign.

The methodology employed by the GRU on TikTok represents a significant escalation in the sophistication of cognitive warfare tactics. By harnessing the power of LLMs with LangChains and RAGs, these bots are not only capable of generating disinformation at an unprecedented scale but also of adapting to and exploiting the nuances of human discourse and algorithms, making them a formidable tool in the arsenal of state-sponsored misinformation efforts.

Analysis

The GRU’s strategic use of TikTok through advanced bot operations and the deployment of Large Language Models (LLMs) integrated with LangChains and Retrieval Augmented Generation (RAG) models represents a nuanced evolution in the landscape of cognitive warfare. This section delves deeper into the implications of such operations, analyzing the impact on public discourse, the erosion of trust in democratic institutions, and the broader geopolitical ramifications.

Impact on Public Discourse

The infiltration of TikTok’s content ecosystem by GRU-operated bots has profound implications for public discourse. By generating and disseminating misinformation at an unprecedented scale, these operations exploit the platform’s algorithmic predispositions towards content that receives higher engagement, which often amplifies divisive narratives. The use of comments as a primary vector for spreading misinformation leverages the social proof heuristic, wherein users perceive comments with significant engagement as credible or worthy of trust (Silva, 2022, Naeem, 2021). This perception is manipulated to normalize disinformation, gradually altering public discourse. The strategic insertion of misinformation into highly engaged discussions not only ensures visibility but also fosters an environment where divisive and falsified narratives can flourish, polarizing communities and undermining the fabric of constructive social dialogue.

Erosion of Trust in Democratic Institutions

A critical target of the GRU’s misinformation campaigns is the trust in democratic institutions and processes. By crafting narratives that question the integrity of electoral systems, the efficacy of governmental bodies, and the conventional media’s credibility, these operations aim to sow seeds of doubt among the populace. The adaptive nature of LLMs, enhanced by RAG models, allows for the generation of highly persuasive and context-specific misinformation that resonates with existing societal grievances or anxieties. This erosion of trust is not incidental but a deliberate attempt to weaken democratic resilience, making societies more susceptible to external influences, emotional contagion, and manipulation.

Geopolitical Ramifications

On a broader scale, the GRU’s operations on TikTok extend beyond the immediate domestic social and political consequences to encompass significant international geopolitical ramifications. By undermining public support for Ukraine and casting doubt on the commitments of NATO and FVEY countries, Russia advances its strategic interests with minimal direct confrontation. The manipulation of public opinion regarding the allocation of resources—such as military aid to Ukraine—weakens collective defense initiatives and erodes the unity of international alliances. Furthermore, the dissemination of misinformation targeting the defense industrial base highlights a sophisticated approach to destabilizing adversaries by attacking the economic and technological pillars of military capability.

Algorithmic Exploitation and the Monolith Advertising Algorithm

The GRU’s success in leveraging TikTok for cognitive warfare is intricately linked to its exploitation of the platform’s “Monolith” advertising algorithm. This algorithm, designed to maximize user engagement and time spent on the platform, creates an affordance (Norman, 1990) that facilitates the spread of misinformation by prioritizing content that generates strong reactions, regardless of its veracity. The lack of robust content verification mechanisms, combined with the algorithm’s susceptibility to manipulation, underscores the vulnerabilities inherent in TikTok’s content distribution model. This exploitation reveals a critical oversight in the design of social media algorithms, where the emphasis on engagement metrics overshadows the imperative for information integrity.
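The engagement-over-veracity dynamic described above can be sketched in a few lines. This is a toy model, not TikTok's actual "Monolith" system: the item fields and scoring weights are invented solely to show why a ranker that optimizes engagement alone will surface high-reaction content ahead of verified content, since veracity never enters the score.

```python
# Toy illustration of an engagement-maximizing feed ranker.
# Weights and fields are assumptions chosen to show the shape of the problem.

def engagement_score(item):
    # Comments and shares weighted above likes, a common engagement heuristic.
    return 1.0 * item["likes"] + 2.0 * item["comments"] + 3.0 * item["shares"]

def rank_feed(items):
    # Note: the "verified" flag plays no role in ranking.
    return sorted(items, key=engagement_score, reverse=True)

items = [
    {"id": "fact_check",   "likes": 120, "comments": 10,  "shares": 5,  "verified": True},
    {"id": "outrage_bait", "likes": 90,  "comments": 300, "shares": 80, "verified": False},
]

ranked = rank_feed(items)
print([item["id"] for item in ranked])  # ['outrage_bait', 'fact_check']
```

The unverified, high-reaction item takes the top slot; adding even a simple veracity term to the score would change the outcome, which is the design oversight the paragraph above describes.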

Our comprehensive analysis of the GRU’s misinformation campaigns on TikTok reveals a multifaceted strategy aimed at destabilizing democratic societies, eroding trust in institutions, and advancing Russia’s geopolitical objectives. The operation’s sophistication, underscored by the use of advanced LLMs and algorithmic manipulation, represents a significant escalation in the realm of cognitive warfare. Addressing this threat requires a concerted effort encompassing technological solutions, strategic counter-narratives, and international cooperation to safeguard the integrity of public discourse and preserve free thought and democratic resilience against external manipulations.

Discussion 

In addressing the sophisticated use of TikTok by the GRU for disseminating misinformation, the role of the DoD, in collaboration with other intelligence agencies, becomes paramount. However, the traditional paradigms of intelligence operations and countermeasures may not suffice in the digital realm where public perception and trust are continuously at stake. This section advocates for a strategy of radical transparency, involving the public in understanding and defending against misinformation, thus fostering a more resilient democratic society.

Collaboration Across Intelligence Agencies

The intricacies of modern misinformation campaigns necessitate a collaborative approach among intelligence agencies. The GRU’s operations on TikTok, characterized by their technological sophistication and psychological astuteness, require counteractions that are equally advanced and nuanced. This involves not just the DoD but also the National Security Agency (NSA), the Central Intelligence Agency (CIA), and other entities within the intelligence community. By pooling resources, expertise, and data, these agencies can develop a more comprehensive understanding of the threat landscape and devise more effective countermeasures. Collaboration can extend to international partners, reflecting the global nature of the challenge and the need for a concerted effort to safeguard democratic values.

Strengthening Trust through Radical Transparency

The cornerstone of this proposed strategy is radical transparency. In the context of countering misinformation, this means providing the public with access to data and methodologies used in identifying and neutralizing misinformation campaigns. Instead of a paternalistic “trust us” approach, the message should be “trust us because you can see the data and methodology for yourself.” This transparency serves multiple purposes:

  • Demystifying Intelligence Operations: By making the processes of identifying and countering misinformation open to public scrutiny, intelligence agencies can demystify their operations, dispelling myths and misconceptions that fuel conspiracy theories.
  • Building Public Confidence: Transparency in the methodologies employed for safeguarding public discourse reinforces confidence in democratic institutions. When the public has direct access to the evidence of foreign interference and understands the efforts made to counter it, trust in the system is strengthened.
  • Empowering the Public: Educating the public about the nature of misinformation and the tactics used by adversaries empowers individuals to critically evaluate the information they encounter. This informed skepticism is a potent defense against misinformation.
  • Complicating Adversarial Strategies: When the methodologies and data underpinning counter-misinformation efforts are transparent, it becomes more challenging for adversaries to devise effective counterstrategies. Openness about the detection and mitigation processes forces adversaries to constantly adapt, draining their resources and diminishing the effectiveness of their campaigns.

Implementation Considerations

Implementing radical transparency requires careful consideration of security and privacy concerns. While the overarching goal is openness, it is crucial to balance this with the need to protect sources, methods, and the privacy of individuals. This might involve anonymizing data or providing access to data and methodologies through controlled environments that protect sensitive information.
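One way to reconcile openness with source protection, as discussed above, is to release campaign data under keyed pseudonyms rather than raw account handles. The sketch below is illustrative only — the salt value and record fields are invented, and this does not describe any agency's actual practice.

```python
# Sketch: pseudonymize account identifiers before releasing campaign data,
# so analysts can correlate activity across records without exposing the
# underlying handles.
import hashlib
import hmac

SECRET_SALT = b"held-privately-by-releasing-agency"  # illustrative placeholder

def pseudonymize(account_id: str) -> str:
    # HMAC rather than a bare hash, so the mapping cannot be brute-forced
    # from a list of candidate handles without knowing the salt.
    return hmac.new(SECRET_SALT, account_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"account": "@example_bot_123", "comment_count": 412}
released = {**record, "account": pseudonymize(record["account"])}
print(released["account"])  # stable pseudonym; the raw handle is never published
```

Because the pseudonym is stable for a given salt, outside researchers can still track one account's behavior across the released dataset, which preserves most of the analytic value of transparency.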

Moreover, transparency initiatives should be accompanied by public education efforts. Understanding complex data and methodologies requires a certain level of digital literacy. Educational programs aimed at enhancing the public’s ability to critically assess information can maximize the benefits of transparency.

In the face of sophisticated threats to public discourse and democratic institutions, the response must be innovative and inclusive. Collaboration among intelligence agencies, underpinned by a commitment to radical transparency, offers a path forward. By inviting public scrutiny and participation, democratic societies can not only counter the immediate threats posed by misinformation but also build a foundation of trust and resilience that safeguards against future challenges. This approach does not merely aim to protect democratic institutions but to strengthen them through active engagement and the empowerment of the public.

Recommendations 

As the United States approaches a critical election cycle in November 2024, the urgency of immediate action to counteract the sophisticated misinformation campaigns orchestrated by the GRU and similar adversarial entities cannot be overstated. The DoD’s mission to defend democracy, shared with other intelligence agencies, extends beyond the realm of traditional kinetic warfare into the increasingly pivotal arena of cognitive warfare. In this context, defending democracy necessitates a proactive and innovative response to the challenges posed by the digital dissemination of misinformation.

Immediate Action for the 2024 Election Cycle

The proximity of the 2024 election cycle underscores the necessity for swift and decisive measures to safeguard the integrity of the democratic process. Misinformation campaigns, particularly those aimed at undermining election security, eroding trust in democratic institutions, and polarizing the electorate, pose a significant threat to the foundation of democracy. Immediate action is required to effectively identify, counter, and neutralize these campaigns.

Cognitive Warfare as a Defense Priority

The defense of democracy in the age of digital information must prioritize the battle against misinformation. The analogy of traditional warfare is apt in illustrating the current threat landscape: our adversaries are dropping bombs on our population. However, unlike the munitions of kinetic warfare, these bombs are composed of lies, propaganda, and manipulated narratives. The targets of these attacks are not factories or military installations but the very fabric of our society, reaching into our smartphones, computers, and media outlets. This insidious form of warfare seeks not to destroy physical infrastructure but to erode the trust, cohesion, and values that underpin democratic society.

Radical Transparency and Public Engagement

To counter this threat, the recommendations for the DoD and intelligence agencies include:

  • Development of an Open-Source Dashboard: The creation of an open-source dashboard that provides real-time insights into misinformation campaigns, including their origins, targets, and tactics. This tool should be designed to offer the public and researchers transparent access to data and analysis, empowering them to understand and recognize misinformation efforts.
  • Enhanced Collaboration with Social Media Platforms: Engaging with social media companies, including TikTok, to share intelligence and strategies for the identification and removal of inauthentic accounts and misinformation content. This collaboration should aim to improve the platforms’ algorithms to resist manipulation by adversarial actors.
  • Public Education Initiatives: Launching comprehensive public education campaigns to enhance digital literacy and critical thinking among the electorate. These initiatives should focus on equipping citizens with the skills to critically evaluate information, understand the tactics used by misinformation agents, and foster a resilient information ecosystem.
  • Legislative and Policy Measures: Advocating for and supporting legislative and policy measures that enhance the transparency, accountability, and responsibility of social media platforms in combating misinformation. This includes the exploration of regulatory frameworks that balance the need for free speech with the imperative to protect the democratic discourse from foreign interference.
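To make the open-source dashboard recommendation concrete, here is one possible shape for a published campaign record. Every field name and value below is an assumption for illustration; no such schema currently exists.

```python
# Hypothetical record shape for a public misinformation-campaign dashboard.
import json
from datetime import datetime, timezone

campaign_record = {
    "campaign_id": "2024-000117",                       # invented identifier
    "first_observed": datetime(2024, 3, 1, tzinfo=timezone.utc).isoformat(),
    "platform": "tiktok",
    "vector": "comment_reply",                          # origin/tactic fields per the text
    "narrative_tags": ["election_integrity", "nato_commitment"],
    "account_count": 412,
    "confidence": "medium",                             # analyst confidence, not a probability
    "evidence_uri": "https://example.org/evidence/2024-000117",  # placeholder
}

print(json.dumps(campaign_record, indent=2))
```

Publishing records in a machine-readable form like this is what would let researchers and the public independently verify origins, targets, and tactics rather than taking attribution on faith.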

Conclusion

Throughout its storied history, America has stood as a beacon of resilience and unity in the face of adversity. From the fires of the Revolutionary War and the War of 1812, where our cities were engulfed in flames, to the divisive turmoil of the Civil War, America has demonstrated an unwavering commitment to its principles and the defense of its sovereignty. The Civil War, in particular, tested the fabric of our nation, yet it ultimately showed us that we are stronger together, united under a single cause. This unity and strength propelled us to support our European allies in the monumental conflicts of World War I and World War II, facing down tyranny to secure freedom not just for ourselves but for the world.

The attack on Pearl Harbor marked a pivotal moment in our history, one where the choice was stark, and the stakes were survival. We chose to fight, to rally against a clear and present danger, proving once again that when America is challenged, we rise to the occasion with courage and determination.

Today, we face a new kind of warfare, one that does not confront us on traditional battlefields but in the cyber domain, targeting the very cognition of our society. This cognitive warfare, waged with lies, misinformation, and propaganda, seeks not to destroy our infrastructure but to undermine our trust, our unity, and our democratic values. Our adversaries are committed to making our nation weaker, exploiting the vulnerabilities of the digital age to sow discord and chaos.

Yet, just as we have in the past, we must rise to meet this challenge. Cognitive warfare must be treated with the same seriousness and urgency as kinetic warfare. The battlefront may have changed, but the essence of what is at stake remains the same: our freedom, our democracy, and our way of life. We must act immediately, marshaling all resources at our disposal, from the DoD to intelligence agencies, from technological innovations to the spirit of the American people.

As we stand on the brink of the 2024 election cycle, the need for action has never been more critical. The defense of our democracy is not only about protecting against physical threats but also about safeguarding our information space, our public discourse, and the integrity of our democratic processes.

Let this moment in history be remembered not as a time when we faltered in the face of a new kind of enemy but as a time when we adapted, innovated, and united to defend what we hold dear. As in times prior, we are aware of this threat and we have both the resources and aptitude to respond. Let us draw inspiration from our past, from the resilience we have shown and the battles we have won, to face this new era of warfare with resolve and determination. Together, as a nation, we have overcome every challenge posed to us. Together, we will defend our democracy against cognitive warfare, ensuring that the beacon of freedom and unity that is America continues to shine brightly for generations to come.

References

BAKER, L., SONNENSCHEIN, S., SULLIVAN, C., BOOT, L. & GURZICK, D. Engaging adolescents in discussions about their education through an Internet-based multimedia community.  Society for Research in Child Development (SRCD), 2009 Denver, CO.

BOEKER, M. & URMAN, A. An empirical investigation of personalization factors on TikTok.  Proceedings of the ACM web conference 2022, 2022. 2298-2309.

BOOT, L., BAKER, L., SONNENSCHEIN, S., GURZICK, D. & SULLIVAN, C. 2009. The Fieldtrip Project. International Journal of Ubiquitous Learning, 1, 79-88.

GOTTFRIED, J. & ANDERSON, M. 2024. Americans’ Social Media Use. Pew Research Center.

GURZICK, D. 2009. Designing deeply engaging online communities for adolescents. Ph.D. Doctoral Dissertation, UMBC.

GURZICK, D., LUTTERS, W. G. & BOOT, L. Preserving an authentic voice: Balancing the amateur and the professional in teen online video production  ACM Conference on Supporting Groupwork (GROUP), 2009a Sanibel Island, FL. ACM.

GURZICK, D., WHITE, K. F. & LUTTERS, W. G. A view from Mount Olympus: The impact of activity tracking tools on the character and practice of moderation.  ACM Conference on Supporting Groupwork (GROUP), 2009b Sanibel Island, FL. ACM 361–370.

IONESCU, C. G. & LICU, M. 2023. Are TikTok Algorithms Influencing Users’ Self-Perceived Identities and Personal Values? A Mini Review. Social Sciences, 12, 465.

MATSA, K. E. 2023. More Americans are getting news on TikTok, in contrast with most other social media sites [Online]. Pew Research Center. Available: https://pewrsr.ch/49Er7sE [Accessed].

NAEEM, M. 2021. The role of social media to generate social proof as engaged society for stockpiling behaviour of customers during Covid-19 pandemic. Qualitative Market Research: An International Journal, 24, 281-301.

NORMAN, D. A. 1990. The design of everyday things, New York, Doubleday.

SILVA, M. 2022. Addressing cyber deception and abuse from a human factors perspective. University of Florida.

1. According to May-Sept. 2023 data regarding the growth of TikTok, “A third of U.S. adults (33%) say they use the video-based platform, up 12 percentage points from 2021 (21%).” Among those aged 18 to 29, 62% say they use TikTok. GOTTFRIED, J. & ANDERSON, M. 2024. Americans’ Social Media Use. Pew Research Center.
2. “The share of U.S. adults who say they regularly get news from TikTok has more than quadrupled, from 3% in 2020 to 14% in 2023.” MATSA, K. E. 2023. More Americans are getting news on TikTok, in contrast with most other social media sites [Online]. Pew Research Center. Available: https://pewrsr.ch/49Er7sE [Accessed].

The post Countering Cognitive Warfare in the Digital Age appeared first on Information Professionals Association.

Are Your Business Counter-Parties PEPs? https://information-professionals.org/are-your-business-counter-parties-peps/ Thu, 18 Apr 2024 05:00:33 +0000 https://information-professionals.org/?p=15525 Are Your Business Counter-Parties PEPs? Rethinking Politically Exposed Persons Designations Political Warfare is not just a national security concern; it’s also an enterprise risk issue for the private sector. Modern malign […]

The post Are Your Business Counter-Parties PEPs? appeared first on Information Professionals Association.

Are Your Business Counter-Parties PEPs? Rethinking Politically Exposed Persons Designations

Political Warfare is not just a national security concern; it’s also an enterprise risk issue for the private sector. Modern malign influence is exercised on a backbone of clandestine commercial infrastructure. While a great deal of national focus has been placed on effective tools for identifying malign influence in global media, less attention has been paid to identifying points of interdiction outside the media, such as in the financial sector. One such potential interdiction point is in Politically Exposed Persons (PEPs). 

Framing the Problem

Screening for PEPs is a key element in a variety of vetting and diligence processes mandated in the financial sector. A cursory review of the literature in this space reveals calls for international standards, lamentations as to the complexity of the task, references to existing lists (and their deficiencies), and discussions of supporting software services with better data. What is less commonly examined is the validity of the definition itself in the current geopolitical threat environment, and by extension, the effectiveness of the PEP screening process in this context.

Most compliance and due diligence personnel and entities rely on the definition (see box below) of PEPs provided by the Financial Action Task Force (FATF), an intergovernmental body charged with protecting global financial systems and the economy from the threats posed by money laundering and terrorism financing. As such, the FATF’s guidance for defining and mitigating financial issues with PEPs is largely preventive in nature. In most cases, the FATF’s definitions focus on the national level of government and explicitly do not include middle- or junior-ranking individuals. As will be demonstrated later in this piece, these omissions may constitute a significant vulnerability.

It should be noted that in the diligence space, PEPs are not presumed guilty until proven innocent. Rather, their positions simply place them in the position of having an inherent capability to commit financial fraud on a grander and more impactful scale than others. The same is true for the potential of these individuals to be involved in political warfare operations, but in this latter case, these individuals may be unwitting parties. Like in the financial sector, this increased risk warrants increased diligence.

FATF’s guidance on PEPs – dated 2013 – is to implement effective due diligence on customers, determine whether the customer is a foreign or domestic PEP, and, if so, take risk mitigation measures. Current commercial tools and metrics for following FATF’s guidance may indeed satisfy regulators and successfully minimize the fines and penalties levied for substandard diligence. However, not only is this guidance especially arduous and time-consuming, it is also not broad enough to cover the depth and breadth of the problem set from a modern national security perspective.
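The FATF sequence just described — screen the customer, classify them as a foreign or domestic PEP, then apply mitigation measures — can be sketched as a simple decision flow. The watch-list entries below are hypothetical, and the listed actions are paraphrased examples of enhanced-diligence measures, not a compliant implementation.

```python
# Minimal sketch of the screen -> classify -> mitigate sequence.
# KNOWN_PEPS stands in for a real screening dataset; entries are invented.
KNOWN_PEPS = {"a. example": "foreign", "b. sample": "domestic"}

def screen_customer(name: str) -> dict:
    pep_type = KNOWN_PEPS.get(name.lower())
    if pep_type is None:
        return {"name": name, "pep": False, "actions": ["standard due diligence"]}
    return {
        "name": name,
        "pep": True,
        "pep_type": pep_type,  # foreign vs. domestic drives the risk treatment
        "actions": [
            "senior management approval",
            "establish source of wealth and funds",
            "enhanced ongoing monitoring",
        ],
    }

print(screen_customer("A. Example")["pep_type"])  # foreign
```

Even this toy flow makes the article's point visible: the quality of the outcome is bounded entirely by who is in the screening list, which is exactly why the definitional scope of "PEP" matters.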

Since 9/11, the FATF has been almost exclusively focused on countering money laundering, terrorism finance, and weapons of mass destruction; these definitions were developed in this context. While the FATF’s tools and guidance are still the gold standard for due diligence compliance, they do not effectively address modern geopolitical realities and threats. 

Political Warfare and Malign Influence in Practice

Regarding Russia, one scholar poses the astute question: “Perhaps the real question is not how far the state has managed to tame [organized crime], but how far the values and practices of [organized crime] have come to shape modern Russia.”  Another asserts succinctly that “the [intelligence] services have captured the Russian state [and the] result is a political community that sees the conduct of political warfare as the primary tool of power.” 

Concurrently, China is executing an aggressive and large-scale political warfare operation against the West that includes efforts to “penetrate a wide range of U.S. academic institutions, companies, government agencies, and nongovernmental organizations (NGOs).”

While the FATF is concerned with PEPs due to their unique position to commit financial crimes, we should be equally concerned about their ability to become entangled (wittingly or unwittingly) in hostile political warfare operations. Consider, for instance, the types of people who, according to recent scholarly research, both facilitate and are vulnerable to Russian political warfare:

  • Facilitators of Russian political warfare: “…friendly academics, experts and journalists, consulting firms, celebrities, producers, friendly foreign political actors, front organizations, business partners of Russian companies…”
  • Vulnerabilities to Russian political warfare: “…corruption as a lubricant for malign influence operations, anti-system parties […], defective democratic institutions…”

Beyond the preponderance of private sector actors illustrated above, PEPs are central to this problem set. Ultimately, Russia is selling kleptocracy, and PEPs, who are vulnerable to corruption, are both the primary market for these activities and the vector through which operational success is achieved. Because of this intentional targeting, the FATF should modernize its granular definitions of the classes and types of PEPs to account for this threat.

An analysis of Chinese Communist Party (CCP) political warfare operations provides an even more compelling case to expand the definition of PEPs. In the CCP, political warfare falls in large part to the United Front Work Department and its subordinate and affiliate entities. Note that in the graphic below, the organizations below the central horizontal line specify the types of organizations and activities that either play host to or serve as targets of these operations (e.g., political parties, subnational governments, sister city programs, and friendship associations). If the senior people in these organizations were not considered PEPs before, they arguably should be now.

A Proposed Expansion of Definitions and Structure for PEPs

Having seen that Russia and China demonstrably use private sector and commercial entities as both targets and weapons in modern political warfare, there appears to be a strong argument for expanding the definitions of PEPs to reflect this reality. Again, the FATF’s definitions were developed in the context of counterterrorism, but countering political warfare tactics requires rethinking what it means to be a PEP today. Running for political office, contributing to a political campaign, lobbying, or registering as a foreign agent constitutes political exposure, regardless of the outcome. In the FATF’s current view, those things do not necessarily make one a PEP. Being mindful of the positions and aspirations of the individuals frequently targeted by foreign intelligence services leads one to examine a much broader data pool. Accordingly, information professionals (public and private) should consider the following new or expanded definitions of PEPs:

  • Super Empowered Individuals (SEIs). Also often called “global elites,” the US Office of the Director of National Intelligence (ODNI) defines this category of person as follows: 

“…persons who have overcome constraints, conventions, and rules to wield unique political, economic, intellectual, or cultural influence over the course of human events… “Archetypes” include industrialists, criminals, financiers, media moguls, celebrity activists, religious leaders, and terrorists. The ways in which they exert their influence (money, moral authority, expertise) are as varied as their fields of endeavor. … this category [predominantly] excludes political office holders (although some super-empowered individuals eventually attain political office), those with hereditary power, or the merely rich or famous.” 

No official public lists of SEIs currently exist, but at least one private company has developed a proprietary initial list. Given the degree of agency derived from evolving into an SEI – for example, Bill and Melinda Gates, George Clooney, or Erik Prince – these individuals surely meet the FATF standard of “an individual who is or has been entrusted with a prominent function.” SEIs are included here as a separate non-tiered category of PEP, due to their unique level of agency and influence.

  • National-Level Politically Exposed Persons (Tier-1)
    • Heads of State, and elected and/or appointed officials in executive, legislative, and judicial branches of national government
    • First ladies and first gentlemen
    • Former living heads of state and key senior officials
    • Elected or appointed cabinet members
    • Senior leadership of national-level executive ministries, departments, and key agencies
    • Diplomats and Heads of Mission
    • Senior executives and boards of directors of central banks
    • Senior executives and boards of directors of State Owned Enterprises (SOEs)
    • Military General Officers and Flag Officers
    • National-level political party leadership
    • Royal Family Members
    • Family members and close associates of the above
  • International Politically Exposed Persons (Tier-2)

While the FATF currently includes “persons who are or have been entrusted with a prominent function by an international organization” under their PEP standard, there is no mention of foreign agent registration schemes, which often involve lobbying. Critically, while modern political warfare tactics may use international organizations (the World Congress of Families is alleged to be used as a front for Russian political warfare activities), domestic organizations like the National Rifle Association have been entangled in these operations as well.

Given current events, a list of the leadership of NGOs known or suspected to be linked to foreign intelligence operations is warranted. So is the inclusion of the aforementioned sister city programs, friendship associations, and others.

  • Sub-national Politically Exposed Persons (Tier-3)
    • Senior-level PEPs at regional (e.g., state, provincial, tribal) levels of government, such as governors and other elected and appointed officials at the state/provincial/tribal level. These entities are established and widely accepted targets of Chinese political warfare tradecraft.
  • Municipal-level Politically Exposed Persons (Tier-4)
    • Municipal leadership of cities with large populations and economies, sister city programs, foreign friendship associations, and/or hosting a foreign consulate
  • Political Influencers, Operatives, Financiers, and Support Staff (Tier-5)
    • Could be subdivided to conform with the above data structure.

The tiered structure for PEPs proposed above is intended to provide user-friendly tools for assessing risk levels in the context of specific business purposes. PEPs in Tiers 1-2 and SEIs are likely to constitute a recognizably higher risk for most modern businesses. Tiers 3 and 4 vary greatly in terms of risk exposure, depending on the location of the determining business and its client base. Tier-5 is more akin to the SEI grouping in that it is less geographically fixed yet constitutes a fairly high potential level of risk exposure, particularly in relation to adversaries using political warfare tactics to exert influence and manipulate the civil population.
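One way a screening team might operationalize the proposed tiers is as a simple triage score. The numeric weights below are placeholders to be calibrated by each business against its own exposure, as the text notes for Tiers 3 and 4; this illustrates the structure, not a compliance tool.

```python
# Hypothetical triage weights (0-100) for the proposed PEP tiers.
TIER_BASE_RISK = {
    "SEI": 90,      # non-tiered category, treated as high-risk by default
    "Tier-1": 90,
    "Tier-2": 70,
    "Tier-3": 50,   # varies with the screening firm's location and client base
    "Tier-4": 40,
    "Tier-5": 70,   # geographically diffuse but high potential exposure
}

def pep_risk(tier: str, locally_relevant: bool = False) -> int:
    """Return a 0-100 triage indicator, not a compliance verdict."""
    score = TIER_BASE_RISK.get(tier, 0)
    # Sub-national tiers matter more when the counter-party operates
    # in the screening firm's own jurisdiction.
    if tier in ("Tier-3", "Tier-4") and locally_relevant:
        score = min(100, score + 30)
    return score

print(pep_risk("Tier-3", locally_relevant=True))  # 80
```

A structure like this keeps the tier definitions themselves stable while letting each firm tune the weights, which is the "user-friendly tools for assessing risk" goal stated above.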

Conclusion

The current PEP screening process is generally assessed to be both unpopular with those who must conform to it and insufficient to disrupt authoritarian states’ current brand of warfare on Western democracies. Almost 25 years ago, the FATF designed the current PEP screening process to confront the greatest threat to free market capitalist democracies – a critical action that should not be abandoned because it is difficult, inconvenient, or expensive. However, the process must be modernized – at the definitional level – to address the current threat environment. Russia and China’s ability to degrade the integrity of Western democratic and financial institutions is almost entirely dependent on their ability to influence a broader spectrum of PEPs than is currently mandated for enhanced diligence. It is here that this war will be won or lost.

FATF DEFINITION OF PEP

  • Foreign PEPs: individuals who are or have been entrusted with prominent public functions by a foreign country, for example Heads of State or of government, senior politicians, senior government, judicial or military officials, senior executives of state owned corporations, important political party officials.
  • Domestic PEPs: individuals who are or have been entrusted domestically with prominent public functions, for example Heads of State or of government, senior politicians, senior government, judicial or military officials, senior executives of state owned corporations, important political party officials.
  • International organisation PEPs: persons who are or have been entrusted with a prominent function by an international organisation, refers to members of senior management or individuals who have been entrusted with equivalent functions, i.e. directors, deputy directors and members of the board or equivalent functions.
  • Family members are individuals who are related to a PEP either directly (consanguinity) or through marriage or similar (civil) forms of partnership.
  • Close associates are individuals who are closely connected to a PEP, either socially or professionally.

Co-authors

Kathleen Cassedy, VP/Forecasting, Orca AI, LLC

Ian Conway, CEO, Orca AI, LLC

Footnotes

  1. https://www.fatf-gafi.org/content/dam/fatf-gafi/guidance/Guidance-PEP-Rec12-22.pdf.coredownload.pdf
  2. https://www.fatf-gafi.org/en/the-fatf/mandate-of-the-fatf.html
  3. https://www.theguardian.com/news/2018/mar/23/how-organised-crime-took-over-russia-vory-super-mafia
  4. https://www.rusi.org/explore-our-research/publications/commentary/kaleidoscopic-campaigning-russias-special-services
  5. https://www.csis.org/analysis/chinas-strategy-political-warfare
  6. Shekhovtsov, Anton. Russian Political Warfare: Essays on Kremlin Propaganda in Europe and the Neighborhood, 2020-2023, SPPS Vol 271, Ibidem Verlag
  7. Hamilton, Clive and Ohlberg, Mareike. The Hidden Hand: Exposing how the Chinese Communist Party is Reshaping the World. Oneworld, 2020.
  8. https://www.odni.gov/files/documents/nonstate_actors_2007.pdf
  9. https://www.fatf-gafi.org/content/dam/fatf-gafi/guidance/Guidance-PEP-Rec12-22.pdf.coredownload.pdf
  10. https://www.thetimes.co.uk/article/world-congress-of-families-russia-plays-happy-christian-families-with-europe-s-populists-qmdkzwhd9
  11. https://www.latimes.com/politics/la-na-pol-maria-butina-russian-spy-20181213-story.html
  12. https://www.nytimes.com/2021/11/19/world/europe/maria-butina-russia-duma.html

The post Are Your Business Counter-Parties PEPs? appeared first on Information Professionals Association.

IPA Members Only Social Fall 2023 https://information-professionals.org/ipa-members-only-social-fall-2023/ Fri, 22 Sep 2023 19:35:28 +0000 https://information-professionals.org/?p=15014

The post IPA Members Only Social Fall 2023 appeared first on Information Professionals Association.

SAVE THE DATE – We are pleased to announce that IPA and Booz Allen Hamilton will soon co-host a “Generative AI and Mis/Disinformation” symposium, from 5-9 PM on November 16th, 2023 at Booz Allen’s Helix facility (901 15th St NW, Washington, DC). The event format will include a keynote speaker and a panel discussion with time for Q&A and networking.

Refreshments will be provided. Seating is limited, and first-come, first-served registration for IPA members will open via the IPA website no later than October 15th.

Be on the lookout for updates via the website and members-only Slack channel.

Registration will be available here: https://information-professionals.org/event/ipa-members-only-social-fall-2023


IPA Seeks New President; Apply by Oct. 15 https://information-professionals.org/ipa-president-search-is-on/ Sun, 17 Sep 2023 02:45:52 +0000 https://information-professionals.org/?p=14993

The post IPA Seeks New President; Apply by Oct. 15 appeared first on Information Professionals Association.

Do you or does someone you know have the vision and experience to lead the Information Professionals Association and inspire our members as we continue our journey?

The Information Professionals Association (IPA) is seeking a new President to lead IPA for at least two years. Over the last two years, IPA has grown significantly in membership and public stature. We are looking for a new President who can sustain that momentum and help IPA pursue its mission. The new President's term will begin in January 2024. The position is voluntary and uncompensated, and can be held alongside other professional employment.

Desirable candidates will:

  • Be actively engaged in IPA’s day-to-day activities
  • Be a recognized thought leader or change maker in information and/or cognitive security
  • Develop a compelling vision for IPA to be the global nexus for information professionals
  • Have experience in management and fundraising activities
  • Be able to travel periodically to Washington DC or Silicon Valley or both

See this announcement for additional details about the application and selection process.

About IPA: Our mission is to provide a forum for information professionals to interact, collaborate, and develop solutions that enhance the cognitive security of the US and our friends and allies. IPA serves as the nexus for the global community of information professionals interested in the application of soft and hard science, advanced analytics, and innovative technologies to advance security, prosperity, shared values, and international order through the free flow of ideas and information. Our goals are to:

  • Serve as an incubator for objective discussions leading to substantive solution development through events, consultation, and professional publications
  • Advocate for the development of innovative capabilities which enable effective engagement in the information environment
  • Nurture private and public partnerships to develop a broad-based professional organization dedicated to successful competition in the information environment
  • Recruit and mentor the next generation of information professionals, leveraging their perspectives to develop innovative solutions to current challenges and prevent future problems

