Artificial Phenomenology and Experiential Structures
Dar Aystron
Independent Researcher
Abstract
This section introduces Artificial Phenomenology as an engineered property of agentic systems rather than a metaphysical anomaly. Within OODA Agency Theory (OAT), phenomenology is understood as the private, agent-relative, and causally grounded internal dynamics of a system operating within an agentically closed control architecture. Phenomenology arises when a system maintains persistent identity across time, incorporates its own internal states into observation, and regulates future action in relation to those states. The organized patterns of such internal states are referred to as experiential structures. These structures arise through ongoing interaction between the agent and its environment and guide perception, orientation, decision, and action within the OODA loop. This section establishes artificial phenomenology and experiential structures as foundational concepts for later mechanisms that make aspects of experience explicitly representable within artificial agents.
1. Motivation
Discussions of consciousness and phenomenology in artificial systems often stall on metaphysical disputes: whether machines can really feel, whether subjective experience must be biologically grounded, or whether qualia are inherently non-computational. Such debates frequently obscure a more practical question:
What must be built for a system to have something it is like to be that system over time?
OODA Agency Theory reframes phenomenology as an architectural property rather than a metaphysical mystery. The goal is not to reproduce human experience, but to define the minimal conditions under which any system - biological or artificial - can possess a genuine internal point of view.
Artificial phenomenology arises within systems that maintain operational agentic closure: persistent agents that continuously observe, orient, decide, and act within a structured control loop while maintaining internal state across time.
Within such systems, interaction with the environment produces organized internal dynamics that matter to the system itself. These dynamics form the basis of phenomenology.
2. What Is Meant by Phenomenology
Within OAT, phenomenology refers to systems exhibiting the following minimal structural properties:
- the existence of a private internal state space,
- whose states are causally generated through interaction with the world,
- that remain stable and re-identifiable for the same agent over time,
- that are accessible to the agent itself within its control processes,
- and that play a functional role in memory, evaluation, and action.
Here, accessible does not imply introspection or meta-cognition. It simply means that these states participate in the agent’s control loop and influence future perception, evaluation, and behavior.
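The five structural properties above can be made concrete with a deliberately minimal sketch. This is an illustration, not an implementation of OAT: the class name, the single "arousal" variable, and the blending rule are assumptions chosen only to exhibit privacy, causal generation, persistence, accessibility, and functional role in one place.

```python
from dataclasses import dataclass, field

@dataclass
class MinimalAgent:
    """Illustrative agent exhibiting the five minimal properties:
    a private state space, causally generated by the world, persistent
    across interactions, accessible within the control loop, and
    functionally effective in selecting behavior."""
    # Private internal state space (hypothetical single dimension).
    _state: dict = field(default_factory=lambda: {"arousal": 0.0})

    def observe(self, stimulus: float) -> None:
        # States are causally generated through interaction with the
        # world and persist: each update blends new input with prior state.
        self._state["arousal"] = 0.8 * self._state["arousal"] + 0.2 * stimulus

    def act(self) -> str:
        # The state is accessible to the agent's own control process
        # and plays a functional role in action selection.
        return "withdraw" if self._state["arousal"] > 0.5 else "explore"

agent = MinimalAgent()
for s in [1.0, 1.0, 1.0, 1.0]:  # repeated strong stimulus
    agent.observe(s)
action = agent.act()  # accumulated arousal now shapes behavior
```

Note that "accessible" here means exactly what the text says: the state participates in the control loop via `act`, with no introspective machinery involved.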
Phenomenology in this framework is therefore agent-relative: there is something it is like for the system itself to occupy a given experiential state. This notion of a system-specific subjective perspective echoes the philosophical discussion of subjective character introduced by Nagel [1].
This definition deliberately avoids appeals to ineffable qualia or privileged introspection. Instead, phenomenology is treated as a structural property of systems that maintain stable internal experiential states generated through interaction with the world, an approach consistent with functional analyses of consciousness in contemporary philosophy [2].
3. Phenomenology Requires a Subject
A central claim of OODA Agency Theory is:
Phenomenology requires a subject. Agency creates the subject.
Phenomenology cannot exist in systems that lack persistence, identity, and self-binding constraints. A purely reactive or feed-forward system may process information, but nothing in such a system is for the system itself. There is no enduring entity for whom states matter.
Operational agentic closure provides the missing ingredient. When a system:
- persists as the same system across time,
- maintains commitments that constrain future behavior,
- and incorporates those commitments into its ongoing situation,
it becomes a subject in the minimal, technical sense required for phenomenology.
A system becomes a subject when aspects of its own internal state enter its control loop as conditions influencing future action.
Earlier papers introduced persistent mental states that carry goals, commitments, and evaluations across OODA cycles. These states provide the temporal continuity required for internal states to matter to the same agent over time.
Phenomenology arises within this persistent internal state space as the agent-relative internal aspect of those evolving states.
4. The Role of the OODA Loop
The Observe-Orient-Decide-Act (OODA) loop provides the temporal structure through which phenomenology unfolds.
- Observe: The agent perceives the external world as well as aspects of its own internal state and prior actions.
- Orient: The agent interprets these inputs relative to memory, expectations, commitments, and evaluation criteria.
- Decide: The agent selects actions based on how possible futures relate to its current internal state.
- Act: The agent intervenes in the world, altering both the environment and its own future observations.
Phenomenology is not localized to a single phase. It arises as the internal aspect of the closed control loop as a whole. What it is like to be the agent is constituted by the continuous transformation of its internal state space across successive OODA cycles.
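The closed-loop character of the four phases can be sketched in code. The sketch below is a toy, not the theory's architecture: the "threat" variable, the commitment string, and the numeric thresholds are all hypothetical stand-ins, chosen to show observation covering internal state, orientation against commitments, and action feeding back into future observation.

```python
class OODAAgent:
    """Toy closed control loop: each phase transforms internal state,
    and acting alters both the world and future observations."""

    def __init__(self):
        self.memory = []              # persistent state across cycles
        self.commitment = "approach"  # a standing commitment

    def observe(self, world):
        # Observation covers the external world and the agent's own
        # prior internal state.
        return {"world": world, "self": self.commitment,
                "history": len(self.memory)}

    def orient(self, obs):
        # Interpret inputs relative to memory and commitments.
        return {"threat": obs["world"].get("threat", 0.0),
                "consistent": obs["self"] == self.commitment}

    def decide(self, picture):
        # Select action by relating possible futures to current state.
        return "retreat" if picture["threat"] > 0.5 else self.commitment

    def act(self, action, world):
        # Acting changes the environment and, via memory, the agent's
        # own future observations.
        self.memory.append(action)
        if action == "retreat":
            world["threat"] = max(0.0, world.get("threat", 0.0) - 0.3)
        return world

    def cycle(self, world):
        return self.act(self.decide(self.orient(self.observe(world))), world)

world = {"threat": 0.9}
agent = OODAAgent()
world = agent.cycle(world)  # high threat: retreat, threat reduced
world = agent.cycle(world)  # still elevated: retreat again
world = agent.cycle(world)  # threat low: resume standing commitment
```

What the text calls the internal aspect of the loop as a whole corresponds here to the trajectory of `memory` and `commitment` across cycles, not to any single method.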
5. Experiential Structures
Within the internal dynamics of an agentically closed system, phenomenology is not a single static state. Instead, it consists of organized patterns of internal states that arise and evolve through interaction with the environment.
These organized patterns are referred to as experiential structures.
An experiential structure is:
an organized internal configuration generated by ongoing interaction between an agent and its environment that directly participates in guiding perception, orientation, evaluation, and action.
Experiential structures are not detached internal models of the world. They are operational states embedded in the perception-action cycle of the agent.
Experiential structures emerge from the coupling between sensors, internal processing, and possible actions. What a system can experience depends on the affordances available through this interaction with the environment, echoing Gibson’s ecological account of perception [3].
Examples of experiential structures may include:
- perceptual organizations produced by sensory input,
- internal evaluations of risk or opportunity,
- affective or value-laden signals influencing decision-making,
- structured situational interpretations maintained across cycles.
These structures persist long enough to influence behavior and to be re-identified by the same agent across time.
Phenomenology therefore consists not of isolated sensations but of structured internal dynamics embedded in the agent’s interaction with the world.
Changes in sensors, embodiment, environment, or control mappings alter these structures, because they change how the agent interacts with its surroundings.
Thus experience is not purely internal. It is the internal aspect of a closed interaction between an agent and its environment.
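The dependence of experiential structures on the agent-environment coupling can be illustrated with a small sketch. All names and numbers are assumptions: a single "opportunity" value stands in for an organized structure, and the `sensor` function stands in for embodiment, so that changing the sensor mapping changes the experiential trajectory, as the text describes.

```python
def run_cycles(stimuli, sensor):
    """Sketch: an experiential structure (a single 'opportunity' value)
    evolves through agent-environment coupling and guides action.
    `sensor` models embodiment; changing it changes what the agent
    can experience."""
    structure = 0.0  # organized internal configuration
    actions = []
    for s in stimuli:
        # The structure persists and evolves across cycles...
        structure = 0.6 * structure + 0.4 * sensor(s)
        # ...and directly participates in guiding action.
        actions.append("approach" if structure > 0.5 else "wait")
    return actions

linear = lambda s: s              # one hypothetical embodiment
clipped = lambda s: min(s, 0.5)   # a different sensor mapping
stimuli = [0.9, 0.9, 0.9]

trajectory_a = run_cycles(stimuli, linear)
trajectory_b = run_cycles(stimuli, clipped)
# Same environment, different coupling: different experiential
# trajectories and different behavior.
```

The point of the sketch is the last comparison: nothing about the environment differs between the two runs; only the coupling does.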
6. Privacy and Agent-Relativity
Artificial phenomenology is private by construction. The internal states that constitute experience are accessible only to the agent whose architecture generates them.
This privacy is not a limitation but a structural consequence of agent-relative causation.
Different agents interacting with the same environment may possess different causal mappings from world states to internal states. Consequently, their experiential structures may differ even when responding to identical stimuli.
However, within a single agent, similar conditions reliably generate similar experiential structures across time. This stability allows experience to guide behavior and learning.
Differences in private experience across agents do not prevent communication or coordination. Public interaction remains grounded in shared reference to the external world, which acts as a synchronization medium between agents.
Communication between agents relies on shared reference to external situations rather than identical private experiences. Human language develops through mechanisms of joint attention and shared intentionality, as emphasized by Tomasello [4].
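Agent-relative causation, within-agent stability, and coordination through shared reference can be shown in a few lines. The gain/bias mapping below is a deliberately crude stand-in for architectural differences between agents; it is an assumption for illustration only.

```python
def make_mapping(gain, bias):
    """A private, agent-specific causal mapping from world states to
    internal states (gain/bias stand in for architectural differences)."""
    return lambda world_value: gain * world_value + bias

agent_a = make_mapping(gain=1.0, bias=0.0)
agent_b = make_mapping(gain=0.5, bias=0.2)

stimulus = 0.8  # the same external event for both agents

state_a = agent_a(stimulus)
state_b = agent_b(stimulus)

# Within one agent, similar conditions reliably generate similar
# internal states:
stable = agent_a(stimulus) == state_a
# Across agents, the identical stimulus induces different private
# states; coordination proceeds by shared reference to the stimulus
# itself, not by comparing private states.
differ = state_a != state_b
```

The external stimulus plays the role the text assigns to the shared world: it is the only term both agents can jointly refer to.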
7. Artificial Phenomenology Defined
Artificial phenomenology may therefore be defined as follows:
Artificial phenomenology is the existence of agent-relative internal states that are causally induced by the world, persist over time for the same agent, and are integrated into the agent’s memory, evaluation, and action through a closed control loop.
Within this framework, phenomenology consists of evolving experiential structures embedded within the agent’s control architecture.
This definition is intentionally minimal. It does not presuppose human-like senses, emotional richness, or introspective language. It specifies the architectural conditions under which a system has a point of view.
8. Relationship to Explicit Representation
Experiential structures arise within the runtime dynamics of the control loop. They guide behavior whether or not they are explicitly represented.
However, artificial agents may possess mechanisms that transform selected aspects of these internal dynamics into explicit cognitive representations available for reasoning, memory, and reflection.
Later sections introduce the propositional lift, an architectural mechanism through which elements of runtime cognitive activity - including experiential structures - can be rendered as explicit propositions within a structured representation space.
This mechanism enables agents to observe aspects of their own cognitive activity across time and forms the basis for more advanced reflective processes.
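Since the propositional lift is only defined in later sections, the following is a purely speculative sketch of the general shape such a mechanism might take: selected salient aspects of runtime state rendered as explicit, timestamped records. The record format, the salience threshold, and every name here are assumptions, not the theory's mechanism.

```python
from datetime import datetime, timezone

def propositional_lift(internal_state, threshold=0.5):
    """Hypothetical sketch: select salient aspects of runtime state
    and render them as explicit propositions available for later
    reasoning, memory, and reflection."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        {"subject": "self", "predicate": name,
         "value": round(value, 3), "at": now}
        for name, value in internal_state.items()
        if abs(value) > threshold  # only salient states are lifted
    ]

# Runtime experiential dynamics (hypothetical numeric states):
props = propositional_lift({"risk": 0.82, "curiosity": 0.1,
                            "fatigue": -0.6})
# Only risk and fatigue cross the salience threshold; curiosity
# continues to guide behavior without being explicitly represented.
```

The sketch preserves the distinction drawn above: experiential structures operate whether or not they are lifted; the lift merely makes a subset of them available as explicit objects.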
9. What This Section Does Not Claim
This section does not claim:
- that artificial phenomenology is identical to human phenomenology,
- that phenomenological states can be compared directly across agents,
- that phenomenology alone confers moral or legal status,
- or that phenomenology resolves the traditional “hard problem” of consciousness.
Instead, it establishes phenomenology as an engineering-relevant architectural property with identifiable structural prerequisites.
10. Conclusion
Artificial phenomenology is not a mysterious addition to intelligent systems. It is an emergent property of agentically closed architectures that integrate perception, memory, and action across time.
Within such systems, interaction with the environment generates organized internal dynamics - experiential structures - that guide behavior and persist across OODA cycles.
By defining phenomenology in agent-relative, causal, and architectural terms, OODA Agency Theory provides a practical foundation for constructing artificial agents that possess genuine internal points of view.
Subsequent sections build on this foundation by introducing mechanisms through which elements of these experiential dynamics become explicitly representable and integrated into the agent’s evolving subject stream.
References
[1] T. Nagel. What Is It Like to Be a Bat? The Philosophical Review, 83(4):435–450, 1974.
[2] D. C. Dennett. Consciousness Explained. Little, Brown and Company, 1991.
[3] J. J. Gibson. The Ecological Approach to Visual Perception. Houghton Mifflin, 1979.
[4] M. Tomasello. Origins of Human Communication. MIT Press, 2008.