predictionerrors.com — Position Paper

The Large Accounting Model: A Unified Ledger for Money and Knowledge

Spencer Nash
Chartered Accountant · Master Black Belt: Financial, Process & System Development
predictionerrors.com
February 2026
Abstract

This paper proposes a unified economic infrastructure that tracks two forms of accumulation on one ledger: money and knowledge. Period Entry accounting eliminates the general ledger. Cryptographic co-signature by four parties — customer, supplier, customer's bank, supplier's bank — eliminates bank, customer, and supplier reconciliation. Every transaction is proven at source. But the ledger records more than pounds. It records learning: Competence and its reliability (Status), Collaboration and its reliability (Belonging), Curiosity and its reliability (Curiosity). A network of content providers and human and AI tutors, operating through Oxford-style tutorials where tutoring and assessment are the same act, builds the knowledge side of the ledger. The network is unified by a dedicated currency — Spennies (study pennies) — issued by a Bank of Learning and backed by the accumulated knowledge of the network, with a presale to early-stage developers and users funding development. The economic drivers — market size, share, price, cost, capital structure — are explicit dimensions within each Period Entry, not hidden inside a black box. Beta and beta reliability emerge from the channel structure. The result is a transparent model of the world where knowledge accumulation preempts economic growth, symmetric information increases economic efficiency, and transparency removes corruption. This is the Large Accounting Model.

1. One Ledger, Two Accumulations

Accounting has always tracked one thing: money. Revenue, cost, profit, loss, assets, liabilities — all denominated in currency. The entire infrastructure of financial reporting, from the general ledger to the balance sheet, is an apparatus for recording the accumulation and movement of monetary value.

But money is the second accumulation, not the first. Before a company can generate revenue it must have knowledge — knowledge of its market, its product, its processes, its customers. Before an individual can earn income they must have competence. Knowledge precedes money. It is the precondition for economic activity, not a byproduct of it.

Yet there is no ledger for knowledge. No standardised way to record what a person knows, how reliably they know it, how well they collaborate, or how actively they seek to learn. Universities issue credentials. Employers write references. Professional bodies grant certifications. But none of these is a ledger. None accumulates over time. None decomposes knowledge into measurable channels. None records reliability.

The Large Accounting Model tracks both accumulations on one ledger. Monetary value is recorded in pounds. Knowledge is recorded in Competence, Collaboration, and Curiosity — each with its own reliability score. A company's value is not just its financial capital. It is its financial capital plus the accumulated, reliability-weighted knowledge of its people.

2. The End of Reconciliation

Traditional accounting requires reconciliation at every boundary. The company's books must reconcile to the bank's books. The customer's records must reconcile to the supplier's records. The subsidiary's accounts must reconcile to the parent's. This reconciliation effort is enormous — consuming significant accounting resource across every organisation in the economy — and exists for one reason: each party maintains separate records of the same transaction.

2.1 Period Entry: No General Ledger Reconciliation

Period Entry accounting eliminates the general ledger entirely. By recording each transaction with its economic lifespan — start date, end date, value, payments — the system derives the P&L, balance sheet, and cash flow for any reporting period directly from the transaction data. There is no month-end close. There are no adjusting journals. There is no trial balance to reconcile. The accrual — work done minus payments made — is computed on the fly.
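As an illustration, the derivation can be sketched in a few lines of Python. Straight-line recognition over the lifespan, and the class and field names, are assumptions of the sketch; the paper specifies only that each transaction carries start date, end date, value, and payments, and that the accrual is work done minus payments made:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PeriodEntry:
    """One transaction with its economic lifespan: no general ledger needed."""
    start: date
    end: date
    value: float                                   # total value over the lifespan
    payments: dict = field(default_factory=dict)   # payment date -> amount

    def earned(self, as_of: date) -> float:
        """Value earned to date, assuming straight-line recognition."""
        if as_of <= self.start:
            return 0.0
        total_days = (self.end - self.start).days or 1
        elapsed = min((as_of - self.start).days, total_days)
        return self.value * elapsed / total_days

    def paid(self, as_of: date) -> float:
        return sum(a for d, a in self.payments.items() if d <= as_of)

    def accrual(self, as_of: date) -> float:
        """Work done minus payments made, computed on the fly."""
        return self.earned(as_of) - self.paid(as_of)
```

Any reporting period's P&L and accrual position falls out of calls like these over the set of live entries; no journal, trial balance, or month-end close is involved.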

2.2 Co-Signature: No Bank or Counterparty Reconciliation

Co-signature eliminates the remaining reconciliations. When four parties — customer, supplier, customer's bank, supplier's bank — cryptographically co-sign a single shared transaction value, the transaction is proven at source. There is nothing to reconcile because there was never a separate record to reconcile against.

Transaction: Customer pays Supplier £10,000

1. Customer signs: "I paid £10,000 to Supplier"
2. Supplier signs: "I received £10,000 from Customer"
3. Customer's bank signs: "I debited £10,000 from Customer"
4. Supplier's bank signs: "I credited £10,000 to Supplier"

Result: Four signatures on one record. No reconciliation. Proven at source.

This is not blockchain. Business payments are identified — you need to know who paid you. When all parties are known, consensus is four signatures, not thousands of nodes. The system operates at the speed and cost of email.
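A minimal sketch of the four-signature check, in Python. A real deployment would use asymmetric signatures (per-party public keys, e.g. Ed25519); HMAC with a per-party secret key stands in here so the example stays self-contained, and all names are illustrative:

```python
import hashlib
import hmac
import json

PARTIES = ("customer", "supplier", "customer_bank", "supplier_bank")

def canonical(record: dict) -> bytes:
    # One shared record, serialised identically for every signer.
    return json.dumps(record, sort_keys=True).encode()

def sign(record: dict, key: bytes) -> str:
    return hmac.new(key, canonical(record), hashlib.sha256).hexdigest()

def co_sign(record: dict, keys: dict) -> dict:
    # Each of the four parties signs the same canonical bytes.
    return {p: sign(record, keys[p]) for p in PARTIES}

def proven_at_source(record: dict, sigs: dict, keys: dict) -> bool:
    """True only if all four parties signed exactly this record."""
    return all(hmac.compare_digest(sigs[p], sign(record, keys[p]))
               for p in PARTIES)
```

Because every party signs the same canonical bytes, a unilateral change to amount, date, or payee invalidates all four checks at once, which is exactly why there is nothing left to reconcile.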

2.3 The Complete Elimination

Reconciliation Type | Traditional Accounting | Large Accounting Model
General ledger | Trial balance, adjusting journals, month-end close | Eliminated by Period Entry
Bank | Bank reconciliation statement | Eliminated by co-signature
Customer | Accounts receivable confirmation, statement reconciliation | Eliminated by co-signature
Supplier | Supplier statement reconciliation, three-way matching | Eliminated by co-signature
Intercompany | Intercompany matching and elimination | Eliminated by co-signature

Every form of accounting reconciliation is eliminated. Not reduced, not automated — eliminated. The transaction is recorded once, signed by all parties, with its economic lifespan intact. Reports are derived. Nothing needs to agree because nothing was ever separate.

3. Two Shared Values

Every co-signed transaction carries two values, not one.

3.1 Monetary Value

The first value is monetary: the amount in pounds, dollars, euros. This is conventional accounting. Period Entry records the monetary value with its economic lifespan. Co-signature proves it. Nothing new here except the elimination of reconciliation.

3.2 Knowledge Value

The second value records the knowledge exchanged or demonstrated in the transaction. Knowledge is tracked across three channels, each mapped to an ECF emotional channel:

Knowledge Channel | ECF Channel | Measures | Scale
Competence | S Status | What you know and how well you know it | −10 to +10, plus Reliability (0 to 1)
Collaboration | B Belonging | How well you work with others | −10 to +10, plus Reliability (0 to 1)
Curiosity | C Curiosity | How actively you seek to learn | −10 to +10, plus Reliability (0 to 1)

Each knowledge channel carries two numbers: the magnitude (how much) and the reliability (how trustworthy that assessment is). This implements Friston's precision concept: not just a score, but a score weighted by confidence. A Competence of +7 with reliability 0.9 is fundamentally different from a Competence of +7 with reliability 0.2.

Negative Competence is not ignorance — it is faked knowledge. A person who claims expertise they do not have scores negative on Competence. The system distinguishes between "doesn't know" (low magnitude, low reliability — Confusion) and "pretends to know" (negative magnitude — active misrepresentation).
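A sketch of how a channel entry might be represented, with magnitude and reliability carried together. The class name, field names, and the thresholds that separate "confusion" from an ordinary assessment are illustrative, not part of a published schema:

```python
from dataclasses import dataclass

@dataclass
class ChannelScore:
    """One knowledge channel entry: a magnitude plus the reliability
    of that magnitude (Friston's precision concept)."""
    magnitude: float    # -10 (faked knowledge) .. +10 (proven expertise)
    reliability: float  # 0 (unverified) .. 1 (repeatedly confirmed)

    def state(self) -> str:
        # Thresholds below are illustrative assumptions for the sketch.
        if self.magnitude < 0:
            return "misrepresentation"  # pretends to know
        if self.magnitude < 2 and self.reliability < 0.3:
            return "confusion"          # doesn't know, and the assessment is uncertain
        return "assessed"
```

The point of carrying both numbers is visible immediately: `ChannelScore(7, 0.9)` and `ChannelScore(7, 0.2)` share a magnitude but are fundamentally different claims.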

The knowledge ledger is co-signed like the monetary ledger. When a tutor assesses a student, the assessment is co-signed by the tutor and the student — and potentially by the institution and the verifying body. Competence is not self-declared. It is proven at source, exactly as a payment is proven at source. The four-party co-signature model applies to knowledge transactions just as it applies to financial transactions.

4. The Learning Network

If knowledge accumulates on a ledger, it must be generated somewhere. The Large Accounting Model creates a network of learning: content providers, human tutors, and AI tutors, operating through a specific pedagogical model.

4.1 Oxford-Style Tutorials

The tutorial model is the Oxford tutorial: one tutor, one or two students, intensive dialogue. This format has a unique property that makes it essential to the system: tutoring and assessment are the same act.

In a lecture, the lecturer teaches and a separate examiner assesses. These are different acts by different people at different times. In a tutorial, the tutor asks questions, listens to answers, probes understanding, identifies gaps, and responds — all in real time. The tutor knows, by the end of the session, exactly what the student understands and exactly where the gaps are. No separate assessment is required. The tutorial is the assessment.

This is how oral examination works. The examiner asks. The student responds. The examiner probes. Understanding is tested in the interaction, not in a written product submitted later. The key test of Competence reliability is: can you explain it? This tests reliability rather than magnitude. A student who scores +7 on Competence magnitude but cannot explain their reasoning has low reliability. A student who scores +5 but can explain every step has high reliability. The oral exam tests both.

4.2 Tutors: Human and AI

The network includes both human and AI tutors. Both operate through the same tutorial format. Both assess in real time. Both co-sign knowledge transactions with the student.

Role | Function | Co-Signs
Content provider | Creates learning material — courses, readings, problems | Content quality assessment
Human tutor | Oxford-style tutorial — teaches and assesses simultaneously | Student Competence, Collaboration, Curiosity
AI tutor | Tutorial dialogue — probes understanding, identifies gaps | Student Competence, Collaboration, Curiosity
Student | Learns, demonstrates understanding, asks questions | Tutor quality, Content quality

The co-signature is bidirectional. The tutor assesses the student. The student assesses the tutor. Both assessments accumulate on both parties' ledgers. A tutor whose students consistently score low Collaboration is producing a signal about the tutor, not just the students. The system learns about everyone.

4.3 Reliability Through Accumulation

A single tutorial is one data point. Reliability is low. But as a student accumulates tutorial assessments from multiple tutors across multiple subjects, the knowledge ledger builds reliability through the same four variables that drive all reliability in the system: volatility (how consistent are the assessments?), age (how long has this person been learning?), sample size (how many assessments?), and trend (improving, declining, or stable?).
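A hedged sketch of such a reliability function. The text names the four drivers (volatility, age, sample size, trend) but not a formula; the saturation constants and the multiplicative form below are assumptions chosen only to show the shape:

```python
import statistics

def reliability(scores, first_seen_days_ago):
    """Illustrative reliability in [0, 1] from the four drivers named in
    the text. Weights and functional form are assumptions of the sketch."""
    n = len(scores)
    if n < 2:
        return 0.0                                  # one data point: no reliability
    consistency = 1 / (1 + statistics.stdev(scores))  # lower volatility -> higher
    age = min(first_seen_days_ago / 365, 1.0)         # saturates at one year
    sample = min(n / 20, 1.0)                         # saturates at 20 assessments
    trend = scores[-1] - scores[0]
    stability = 1.0 if trend >= 0 else 1 / (1 - trend)  # decline lowers reliability
    return consistency * age * sample * stability
```

Under these assumptions, four identical scores from a year-old history outrank one brilliant session, and a declining trend drags reliability down even when the magnitudes stay high.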

This replaces the university credential. A degree is a binary signal: you have it or you don't. The knowledge ledger is a continuous, multi-dimensional, reliability-weighted signal that shows exactly what you know, how reliably you know it, how well you work with others, and how actively you learn. It does not expire. It does not require institutional validation. It accumulates from every co-signed learning interaction across a lifetime.

5. Spennies: The Currency of Learning

A learning network needs a medium of exchange. Students pay tutors. Tutors pay content providers. Content providers invest in new material. If this network runs on conventional currency alone it inherits every friction of the existing financial system — and worse, it ties the value of learning to monetary wealth. A student without money cannot learn. A tutor in a low-income country cannot charge what a tutor in London charges. The network fragments along existing economic lines.

The solution is a dedicated currency: Spennies — study pennies.

5.1 The Bank of Learning

Spennies are issued by a new institution: the Bank of Learning. This is not a metaphor. It is a bank in the accounting sense — it holds deposits, processes transactions, and maintains a ledger. But its ledger is the unified ledger of the Large Accounting Model, tracking both monetary value and knowledge value. The Bank of Learning is the fourth co-signing party for all knowledge transactions on the network.

Co-Signing Party | Role in Knowledge Transaction
Student | Receives learning, pays Spennies, co-signs assessment
Tutor (human or AI) | Delivers tutorial, receives Spennies, co-signs Competence assessment
Content provider | Provides material, receives Spennies from tutor or student
Bank of Learning | Issues Spennies, processes transactions, maintains the ledger

Every knowledge transaction is co-signed by the student, the tutor, the content provider whose material is used, and the Bank of Learning — the same four-party architecture as a monetary payment (customer, supplier, customer's bank, supplier's bank), applied to learning.

5.2 What Spennies Buy

Spennies are spent on learning and earned by teaching:

Action | Spenny Flow
Student takes a tutorial | Student pays Spennies to tutor
Student accesses content | Student pays Spennies to content provider
Tutor delivers a tutorial | Tutor earns Spennies from student
Content provider creates material | Provider earns Spennies from students and tutors
Peer teaches peer | Teaching student earns Spennies (co-signed Collaboration uplift)

Spennies circulate within the learning network. A student spends Spennies to learn. The tutor spends Spennies on content to teach better. The content provider spends Spennies on tutorials to learn what students need. The currency flows through the network, carrying value with it — and every transaction is co-signed, accumulating both the monetary record and the knowledge record on the unified ledger.

5.3 Spenny Value Is Backed by Knowledge

Fiat currency is backed by the productive capacity of a nation. Spennies are backed by the knowledge capacity of the network. As the network grows — more tutors, more students, more content, more co-signed assessments — the total Competence, Collaboration, and Curiosity on the ledger increases. This accumulated knowledge is the productive asset that gives Spennies their value.

This is not speculative. It is measurable. The knowledge ledger records exactly how much Competence exists in the network, with what reliability, in which domains, trending in which direction. The value backing Spennies is not a promise. It is a co-signed, reliability-weighted ledger entry.

5.4 Funding Development: The Presale

Building the Large Accounting Model — the platform, the co-signature infrastructure, the tutorial network, the AI tutors, the Bank of Learning itself — requires development capital. Spennies provide the funding mechanism.

A defined quantity of Spennies is presold to early-stage developers and users before the network launches. These early participants receive Spennies at a founding rate. As the network grows and the knowledge backing increases, the value of Spennies appreciates — because the productive capacity of the network has grown.

This is not an ICO. ICOs sell tokens backed by nothing except speculation on future demand. Spennies are backed by co-signed knowledge accumulation — a measurable, auditable, reliability-weighted asset. The presale is closer to buying equity in a productive enterprise than to buying a speculative token. Early participants fund the infrastructure and receive currency whose value grows as the network's knowledge grows.

Early-stage developers who build the platform earn Spennies for their contribution — a knowledge transaction co-signed by the Bank of Learning. Early-stage tutors who establish the tutorial network earn Spennies at the founding rate. Early-stage students who stress-test the system and provide feedback earn Spennies for their Collaboration. Everyone who participates in building the network accumulates both Spennies (monetary) and knowledge scores (Competence, Collaboration, Curiosity) on the same ledger.

5.5 Exchange

Spennies are exchangeable for conventional currency at a rate determined by the market. As the knowledge network grows, demand for Spennies grows — because Spennies are the medium of exchange for learning on the network. The exchange rate is not arbitrary. It reflects the economic value of the accumulated knowledge the network has produced. A network with a million co-signed, high-reliability Competence assessments has produced something genuinely valuable — and the currency that circulates within it reflects that value.

6. The Economic Model: Explicit Dimensions

The One Computation paper demonstrates that financial analysis and emotional computation are structurally identical — the same five channels, the same prediction error, the same reliability function, the same two outputs (result and reliability of the result). The Large Accounting Model makes this operational.

6.1 Drivers Inside the Period Entry

Every Period Entry carries not just its monetary value but the explicit economic drivers that produced that value. The five sub-channels of Return decompose every transaction:

Driver | What It Records | Primary Channels
Market size | Total addressable opportunity | R Resource, C Curiosity
Market share | Company's portion of opportunity | S Status, B Belonging
Price | What the company charges | S Status, V Values, C Curiosity
Cost | What the company spends to deliver | R Resource, B Belonging
Capital structure | How the company funds itself | R Resource, S Status, V Values

This is not a black box. Every transaction is tagged with the dimensions that drive it. When revenue falls, the system shows whether it fell because the market shrank (Resource), because the company lost share (Status and Belonging), because pricing weakened (Status and Values), or because costs rose (Resource and Belonging). The decomposition is explicit. The dimensions are visible. The reliability of each dimension is computed.
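One simple decomposition treats revenue as market size (in units) × market share × price, so that log-changes in the drivers sum exactly to the log-change in revenue. The identity and the key names are assumptions of the sketch; the paper specifies only that each Period Entry carries its drivers explicitly:

```python
import math

def revenue_bridge(prev: dict, curr: dict) -> dict:
    """Attribute a revenue change to its explicit drivers.
    Assumes revenue = market_size * market_share * price, so the
    per-driver log-changes sum to the total log-change in revenue."""
    drivers = ("market_size", "market_share", "price")
    return {d: math.log(curr[d] / prev[d]) for d in drivers}
```

If revenue falls purely because share fell, the bridge puts the entire log-change in the `market_share` term and zeros elsewhere: the decomposition is explicit rather than buried in an aggregate.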

6.2 Beta and Beta Reliability

Traditional beta — the Capital Asset Pricing Model's measure of systematic risk — is a single opaque number. The Large Accounting Model produces a decomposed beta with reliability.

β = f(β_Resource, β_Status, β_Belonging, β_Values, β_Curiosity)
β_reliability = f(volatility, age, sample_size, trend), computed per channel

Decomposed beta with per-channel reliability.

When beta changes, the system shows which channel drove the change and whether the shift is a real structural change (high reliability) or noise (low reliability). A beta of 1.5 with high reliability across all contributing channels is a fundamentally different proposition from a beta of 1.5 where the key channels are volatile and the data is stale. Traditional CAPM treats them identically. The Large Accounting Model does not.
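A sketch of one way beta and beta reliability could be assembled from the channels. The aggregation rule (a reliability-weighted mean, with overall reliability taken as the mean channel reliability) is an assumption of the sketch; the text specifies only that both outputs emerge from the channel structure:

```python
def decomposed_beta(channels: dict):
    """channels: name -> (beta_contribution, reliability in 0..1).
    Returns (beta, beta_reliability) under an assumed aggregation rule:
    reliability-weighted mean beta, mean channel reliability."""
    total_r = sum(r for _, r in channels.values())
    if total_r == 0:
        return 0.0, 0.0
    beta = sum(b * r for b, r in channels.values()) / total_r
    beta_reliability = total_r / len(channels)
    return beta, beta_reliability
```

Under this rule, a beta of 1.5 built on stale, volatile channels reports the same beta but a visibly lower beta reliability than a 1.5 built on consistent data, which is precisely the distinction traditional CAPM cannot make.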

6.3 The Period Entry as Economic Model

Each Period Entry, with its monetary value, its economic lifespan, its explicit drivers, and its knowledge co-signatures, is a self-contained economic model. The collection of all Period Entries across all co-signing parties is a model of the economy. Not a statistical model built from sampled data. Not a black box neural network trained on historical prices. A transparent, decomposed, reliability-weighted model built from the actual transactions of every participant.

The ledger IS the model. There is no separate forecasting system. There is no separate risk model. There is no separate planning tool. The Period Entries contain the drivers. The drivers contain the dimensions. The dimensions contain the reliability. The forecast extends from the transaction end dates. The model is the data.

7. Knowledge Accumulation Preempts Economic Growth

This is the central economic claim. In the current system, knowledge is invisible to the economy. A company invests in training. The expense appears on the P&L. The knowledge does not appear on the balance sheet. The investment looks like a cost, not an asset. When the trained employee generates more revenue, the connection between the training investment and the revenue outcome is invisible — buried in opaque aggregate numbers.

In the Large Accounting Model, knowledge accumulation is visible. When a company's employees accumulate Competence through co-signed learning interactions, that accumulation appears on the knowledge ledger. When that Competence drives innovation, and the innovation drives revenue, the causal chain is traceable: learning transaction → Competence increase → product improvement → revenue increase. Each step is co-signed. Each step carries reliability.

7.1 Leading Indicator

Monetary results are lagging indicators. Revenue is reported after it is earned. Profit is calculated after costs are incurred. By the time a financial problem appears in the accounts, the underlying cause is months or years old.

Knowledge accumulation is a leading indicator. Rising Competence across a company's workforce signals future capability improvement. Rising Curiosity signals future innovation. Rising Collaboration signals future coordination efficiency. These signals appear in the knowledge ledger before they appear in the financial ledger — because knowledge must accumulate before it can be monetised.

Knowledge accumulation predicts economic growth. A sector where Competence and Curiosity are rising across the workforce will generate economic growth, because more capable, more curious people produce more value. The Large Accounting Model makes this visible in real time, before the financial results confirm it.

7.2 Investment Becomes Measurable

Currently, training expenditure is a cost with no measurable return. The Large Accounting Model turns it into a measurable investment. The cost appears on the monetary ledger. The knowledge return appears on the knowledge ledger. The reliability of both is computed. For the first time, a company can answer: did this training investment actually increase Competence? With what reliability? And did that Competence increase translate into financial return?
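A sketch of what that answer looks like with both ledgers in hand. The effectiveness rule and field names are assumptions for illustration; the inputs are the monetary cost and the co-signed (magnitude, reliability) Competence pair before and after the programme:

```python
def training_return(cost: float, before: tuple, after: tuple) -> dict:
    """Did a training investment increase Competence, and with what reliability?
    before/after are (magnitude, reliability) pairs from the knowledge ledger.
    The 'effective' rule is an illustrative assumption: magnitude must rise
    without the assessment becoming less reliable."""
    mag_gain = after[0] - before[0]
    rel_gain = after[1] - before[1]
    return {
        "cost": cost,
        "competence_gain": mag_gain,
        "reliability_gain": rel_gain,
        "effective": mag_gain > 0 and after[1] >= before[1],
        "gain_per_pound": mag_gain / cost if cost else 0.0,
    }
```

The cost sits on the monetary ledger, the gain on the knowledge ledger, and the link between them is a computation rather than a guess.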

8. Symmetric Information and Economic Efficiency

Information asymmetry is the source of most economic inefficiency. The seller knows more about the product than the buyer. The manager knows more about the company than the investor. The borrower knows more about their creditworthiness than the lender. Every market failure traced to imperfect information — adverse selection, moral hazard, agency costs — is a consequence of asymmetry.

8.1 The Co-Signature Effect

Co-signed transactions are inherently symmetric. Both parties sign the same value. Both parties see the same record. Both parties' banks verify the same amount. There is no version of the transaction that one party sees and the other does not.

This extends to the knowledge ledger. A supplier's Competence score is visible to the customer. A company's Collaboration score is visible to its partners. The reliability of these scores is visible to everyone. You do not need to trust a supplier's claim of expertise — you can inspect their co-signed knowledge ledger and assess the reliability yourself.

8.2 The Efficiency Gains

Inefficiency | Caused By | Resolved By
Adverse selection | Buyer cannot assess quality before purchase | Supplier's co-signed Competence and reliability are visible
Moral hazard | Agent's behaviour is unobservable by principal | All transactions co-signed — behaviour is recorded
Agency costs | Manager's interests diverge from owner's | Transparent decomposed drivers — no hidden decisions
Search costs | Finding competent suppliers/employees is expensive | Knowledge ledger makes Competence searchable
Credit rationing | Lender cannot assess borrower risk | Decomposed beta with reliability — risk is transparent
Market pricing errors | Investors have incomplete information | Explicit drivers with reliability — valuation is decomposed

Each of these inefficiencies imposes a real economic cost — higher prices, worse allocation of capital, underinvestment, overinvestment, fraud. Symmetric information, achieved through co-signed transactions with visible reliability, reduces every one of them.

9. Transparency Removes Corruption

Corruption requires opacity. A bribe is an off-ledger transaction. Fraud is a falsified record. Embezzlement is a hidden transfer. Tax evasion is an unreported income stream. Every form of financial corruption depends on the ability to maintain records that differ from reality.

9.1 Co-Signature Prevents Unilateral Falsification

In a co-signed system, altering a transaction requires compromising four independent parties. The customer, the supplier, and both banks would all need to collude or be hacked. A unilateral falsification — changing the amount, the date, the payee — breaks the signature chain and is immediately detectable.

9.2 Visible Drivers Prevent Hidden Manipulation

In traditional accounting, manipulation is easy because the drivers are hidden inside aggregate numbers. Revenue can be inflated by channel-stuffing. Costs can be capitalised inappropriately. Related-party transactions can be disguised. The aggregate numbers look fine while the underlying reality is distorted.

In the Large Accounting Model, the drivers are explicit. Every transaction carries its economic dimensions — market size, share, price, cost, capital structure — each with its own reliability. Manipulation that inflates revenue must inflate it in a specific dimension. That inflation will show as an anomaly: a market share increase with no corresponding Belonging improvement, or a price increase with no Status justification. The decomposition makes manipulation structurally visible.

9.3 Knowledge Ledger Prevents Credential Fraud

Fake degrees, inflated CVs, fabricated qualifications — all depend on the ability to claim knowledge without verification. The co-signed knowledge ledger eliminates this. Competence is not self-declared. It is co-signed by the tutor who assessed it, with reliability built from multiple assessments over time. You cannot fake a ledger entry that requires a counterparty's signature.

Corruption is a prediction error in the Values channel. It is the gap between the stated value and the actual value of a transaction. The Large Accounting Model makes this gap visible, signed, decomposed, and reliability-weighted. Corruption does not survive transparency. It cannot survive co-signature. It is structurally eliminated by an architecture that records reality once, proved by all parties, with every dimension exposed.

10. A Model of the World

The Large Accounting Model, fully deployed, is a real-time, transparent, reliability-weighted model of the world's economic and intellectual activity.

Every transaction — monetary and knowledge — is recorded as a Period Entry with its economic lifespan and explicit drivers. Every transaction is co-signed by all parties. Every driver carries a reliability score built from volatility, age, sample size, and trend. Every company's Return is decomposable into its five channels. Every person's Competence is decomposable into its learning history. Beta and beta reliability emerge from the channel structure. Forecasts extend naturally from transaction end dates.

This is not a statistical model estimated from samples. It is not a black box trained on historical data. It is the actual economy, recorded at the level of individual transactions, with the full dimensionality preserved. When the model says "this sector's Resource channel reliability is falling," it is not a prediction from a neural network. It is a direct computation from the co-signed transactions of every company in that sector.

10.1 Real-Time Economic Prediction Errors

Aggregate the Period Entries across sectors, regions, and the whole economy. The difference between expected values (from the forward view of the Period Entries) and actual values (from the co-signed outcomes) produces real-time economic prediction errors. Rising negative prediction errors signal contraction. Rising positive prediction errors signal expansion. Not quarterly GDP reports delayed by months — real-time signals from actual transactions.
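The aggregation itself is a direct computation, not a fitted model. A sketch with illustrative field names, where `expected` comes from the forward view of the Period Entries and `actual` from the co-signed outcome:

```python
from collections import defaultdict

def sector_prediction_errors(outcomes) -> dict:
    """Real-time economic prediction errors, aggregated by sector.
    `outcomes` holds (sector, expected, actual) triples: expected from the
    forward view of the Period Entries, actual from the co-signed result."""
    errors = defaultdict(float)
    for sector, expected, actual in outcomes:
        errors[sector] += actual - expected
    return dict(errors)
```

A persistently negative total for a sector is a contraction signal available the week it forms, not a quarter later.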

10.2 Policy Response

Central banks currently make policy with stale data. GDP is reported quarterly, one to two months delayed. Unemployment is monthly. Inflation is monthly. The Large Accounting Model replaces lagging indicators with real-time prediction errors decomposed by channel. A central bank can see that Resource channel prediction errors are turning negative across the manufacturing sector this week — not that GDP fell last quarter.

11. A Stock and Futures Market of Everything

Today, only a small fraction of companies are publicly traded. Of the millions of businesses operating in any economy, only a few thousand have their equity listed on an exchange. The rest — the vast majority of economic activity — are private, opaque, and illiquid. You cannot buy a share in your local bakery. You cannot trade futures on a regional logistics company. You cannot hedge your exposure to a supplier's performance.

The reason is information. Public markets require disclosure: audited accounts, quarterly reports, material event notifications, prospectuses. This disclosure is expensive to produce, expensive to audit, and expensive to regulate. Only companies large enough to bear these costs can list. The information barrier creates a two-tier economy: a small, liquid, transparent public market and a vast, illiquid, opaque private one.

11.1 The Information Barrier Disappears

The Large Accounting Model eliminates this barrier. Every company on the system already has co-signed Period Entries proving every transaction at source; a P&L, balance sheet, and cash flow derived in real time from those entries; explicit economic drivers (market size, share, price, cost, capital structure), each with its own reliability; a decomposed beta with per-channel reliability; and a knowledge ledger recording the Competence, Collaboration, and Curiosity of its workforce.

This is more information, more reliably sourced, more transparently decomposed, than any listed company currently provides. The disclosure that makes public markets possible is not an expensive add-on in the Large Accounting Model. It is a free byproduct of how transactions are recorded.

If every company already has audited, real-time, decomposed, reliability-weighted financial and knowledge statements, there is no reason to restrict trading to a handful of large companies. Every company can be traded. Every company can have a market price. Every company can issue futures. The information barrier that created the distinction between public and private markets is gone.

11.2 Equity in Everything

A bakery with three years of co-signed Period Entries has a transparent revenue history, known customer relationships (Belonging), proven product quality (Status), visible cost structure (Resource), and measurable workforce capability (knowledge ledger). An investor can inspect the decomposed drivers, assess the reliability of each, examine the beta, and make a valuation — all from the ledger. No prospectus required. No auditor required. The ledger is the audit.

This means equity in any business can be issued and traded. Not as a speculative token but as a share in an enterprise whose complete economic reality is visible on the ledger. Small businesses gain access to capital markets. Investors gain access to the full economy, not just the listed fraction. Liquidity extends from the few thousand to the many millions.

11.3 Futures on Everything

Futures contracts require two things: a reliable measure of current value and a credible basis for forecasting future value. The Large Accounting Model provides both. The current value is derived from co-signed Period Entries. The future value is projected from the forward view of those same entries — the uncommitted periods, the driver trends, the reliability trajectories.

This enables futures on any measurable economic dimension:

Futures Contract | Based On
Company revenue futures | Forward Period Entries and driver trends for any company
Sector Competence futures | Aggregated knowledge ledger trends across a sector's workforce
Supply chain futures | Co-signed transaction patterns between linked companies
Regional economic futures | Aggregated prediction errors across a geographic area
Knowledge growth futures | Spenny-denominated Competence accumulation rates

A farmer can hedge against a regional economic downturn. A tutor can take a position on rising demand for AI education. A supplier can hedge exposure to a customer's credit risk — using the customer's own decomposed beta, visible on the ledger, rather than an opaque agency rating.

11.4 The End of Information Privilege

In current markets, information privilege creates structural advantage. Insiders know more than outsiders. Analysts with access know more than retail investors. Large institutions know more than small ones. This asymmetry is not a market failure — it is the market's defining feature. Price discovery is the process of information gradually becoming public.

In the Large Accounting Model, information is symmetric by construction. Every transaction is co-signed and visible to authorised parties. Every driver is decomposed. Every reliability is computed. The information that currently takes quarters to emerge through earnings calls and analyst reports is available in real time, to everyone, from the ledger.

A stock and futures market of everything. Not because regulation was relaxed. Not because risk was ignored. But because the information that makes markets possible — transparent, decomposed, reliability-weighted, co-signed economic data — is a natural output of the Large Accounting Model. Every company is listable. Every dimension is tradeable. Every participant has the same information. This is what efficient markets were always supposed to be.

12. Running the Economy Hotter: Monetary Policy with Accurate Information

For forty years, central banks have operated under a simple constraint: they cannot see the economy clearly enough to manage it precisely. Monetary policy is made with lagging, aggregated, sampled data — quarterly GDP, monthly unemployment, monthly inflation — and the inevitable result is overshoot. Because central banks cannot see what is happening in real time, they must act preemptively. And preemptive action with poor information means systematic caution. The cost of that caution has been borne almost entirely by workers.

12.1 The Monetarist Pattern

The pattern has repeated for four decades. The economy grows. Employment rises. The labour market tightens. Workers, for the first time in the cycle, gain bargaining power. Wages begin to rise. And at precisely this moment — the moment when growth is finally reaching workers — central banks raise interest rates to cool the economy.

The stated reason is inflation control. The actual mechanism is labour market suppression. When central banks see wages rising, they interpret it as an inflation signal and tighten monetary policy. Growth slows. Unemployment rises. Workers lose their bargaining power. Wages stagnate. The cycle resets.

Central banks deliberately cool the economy when labour markets tighten. This is not a conspiracy. It is the rational response to operating with poor information. If you cannot distinguish wage growth driven by genuine productivity improvement from wage growth driven by inflationary pressure, you must treat all wage growth as inflationary. The result is forty years of suppressed real wages, rising inequality, and an economy that systematically transfers the gains of growth from labour to capital.

12.2 Why Poor Information Forces Caution

The central bank's problem is one of reliability. It sees aggregate numbers with low decomposition, high latency, and no per-channel reliability scores. When wages rise, the central bank does not know:

  - whether the growth is driven by genuine productivity improvement or by inflationary pressure;
  - which sectors and regions are driving it;
  - how reliable each of those signals is.

Without this decomposition, the central bank must treat all wage growth identically. And because the cost of allowing inflation to run is perceived as greater than the cost of suppressing growth, the default is to tighten. Every time. Workers pay the price of informational poverty.

12.3 The Large Accounting Model Changes the Calculation

The Large Accounting Model gives central banks what they have never had: real-time, decomposed, reliability-weighted economic data from actual co-signed transactions.

Current data → Large Accounting Model data

Aggregate wage growth (monthly, delayed) → per-sector, per-region wage growth in real time, decomposed by driver.
Unemployment rate (monthly, sampled) → real-time employment transactions, co-signed, with forward visibility from contract end dates.
CPI inflation (monthly, basket-based) → real-time price changes across every co-signed transaction, decomposed by market size, share, cost.
GDP (quarterly, 1–2 month delay) → real-time aggregate prediction errors across all Period Entries.
No knowledge data → Competence accumulation rates across the workforce, a leading indicator of productivity.

With this information, the central bank can distinguish between inflationary wage growth and productive wage growth. If wages are rising in a sector where Competence is also rising — where the knowledge ledger shows genuine skill accumulation — that wage growth is productivity-driven. It is not inflationary. It does not need to be suppressed. The economy can run hotter in that sector without losing control.
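A toy version of that distinction, assuming wage growth and Competence accumulation are both available as annual rates per sector. The tolerance and the labels are hypothetical; the paper specifies only that the two rates are comparable on the ledger.

```python
def classify_wage_growth(wage_growth: float, competence_growth: float,
                         tolerance: float = 0.01) -> str:
    """Label a sector's wage growth by comparing it with the knowledge
    ledger's Competence accumulation rate (both as annual fractions).

    Wage growth matched by Competence growth is treated as
    productivity-driven; the excess over Competence growth is treated
    as an inflationary signal.
    """
    excess = wage_growth - competence_growth
    if excess <= tolerance:
        return "productivity-driven"
    return "inflationary-pressure"

print(classify_wage_growth(0.05, 0.045))  # wages track skill accumulation
print(classify_wage_growth(0.06, 0.01))   # wages outrun skill accumulation
```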

12.4 Running Hotter, Still in Control

The key insight is that accurate information reduces the need for precautionary tightening. A central bank that can see, in real time, that wage growth is decomposed across channels — that it is driven by genuine Competence improvement in some sectors and by labour scarcity in others — can respond with precision rather than blunt force.

Sectors with productivity-driven wage growth: let them run. The growth is real. The wages are earned. The economy benefits.

Sectors with inflationary pressure: targeted intervention. Not economy-wide rate rises that punish every worker, but sector-specific or region-specific measures informed by decomposed data.

The reliability scores make this possible. A central bank can weight its response by the reliability of the signals it is seeing. High-reliability signals of genuine productivity growth get a light touch. Low-reliability signals in volatile sectors get closer monitoring. This is what the ECF reliability function was designed for — and it applies to monetary policy exactly as it applies to emotional regulation and financial analysis.
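The weighting described above can be sketched as a simple decision rule. The threshold and the action labels are hypothetical; the reliability score itself would come from the ECF reliability function referenced in the text.

```python
def policy_action(inflation_signal: float, reliability: float,
                  act_threshold: float = 0.02) -> str:
    """Map a sector's inflation signal and its reliability to a response.

    Only a signal that is both strong and reliable triggers tightening;
    a strong but unreliable signal triggers closer monitoring, and
    everything else is left to run.
    """
    weighted = inflation_signal * reliability
    if weighted >= act_threshold:
        return "targeted tightening"
    if inflation_signal >= act_threshold:
        return "monitor (low-reliability signal)"
    return "let it run"

print(policy_action(0.04, 0.9))  # strong, reliable signal
print(policy_action(0.04, 0.3))  # strong but unreliable signal
print(policy_action(0.01, 0.9))  # weak signal
```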

The economy can run hotter because the information is better. Forty years of monetarism suppressed wages because central banks could not distinguish productive growth from inflationary pressure. The Large Accounting Model provides the decomposition, the reliability weighting, and the real-time visibility to make that distinction. The result is an economy where growth reaches workers — where wages rise with productivity, where labour markets are allowed to tighten, and where central banks intervene with precision rather than panic. This is not a loosening of monetary discipline. It is the replacement of monetary bluntness with monetary intelligence.

13. Unprecedented Growth: AI Meets Transparent Economics

The world is entering the most significant technological transition since electrification. Large language models and generative AI are transforming every knowledge-intensive industry simultaneously — legal, medical, financial, educational, creative, engineering, administrative. The productivity gains are not incremental. They are transformational. But the economic system receiving this wave of technology is the same opaque, lagging, reconciliation-burdened system that has existed for decades. The result, without reform, will be the familiar pattern: enormous gains captured by capital, suppressed by cautious monetary policy, and distributed unevenly.

The Large Accounting Model changes the receiving system. The combination of tight economic information and increased business confidence creates the conditions for the AI wave to produce unprecedented economic growth over the next five to ten years.

13.1 Information Confidence Unlocks Investment

Businesses underinvest when they cannot see clearly. Uncertainty about demand, about competitor behaviour, about regulatory response, about their own cost structures — all of these suppress capital expenditure and hiring. The standard response to uncertainty is caution: hold cash, delay projects, wait for more data.

The Large Accounting Model eliminates the primary sources of business uncertainty. Demand is visible in real time from co-signed transactions. Competitor behaviour is decomposed into visible channel movements. Cost structures are transparent. The knowledge ledger shows whether the workforce has the Competence to execute. Forward visibility extends from Period Entry end dates. Beta and beta reliability tell the company exactly how exposed it is and how much to trust that estimate.

When businesses can see clearly, they invest. When they can see that demand is real, that their competitive position is strong, that their workforce is capable, and that these signals are reliable — they commit capital. The AI wave meets a business environment that is confident not because of sentiment but because of information.

13.2 Natural Language Software Becomes Ubiquitous

There is a second acceleration built into this transition. The software that implements the Large Accounting Model — the Period Entry engines, the co-signature protocols, the knowledge ledger, the economic decomposition — will increasingly be written by AI itself. Natural language programming means that the specification is the code. A business requirement described in English becomes a working system.

This has a structural consequence: software becomes ubiquitous, open source, and therefore very low cost.

When anyone can build software by describing what they want, software ceases to be a scarce good. The proprietary advantage of custom code disappears. The cost of implementation collapses. The barriers to adoption — which have historically slowed every technological transition — fall to near zero. The Large Accounting Model does not require a billion-pound infrastructure programme. It requires a specification that AI can implement, and that specification already exists in papers like this one.

The implications cascade. If Period Entry accounting software costs almost nothing to build, every small business can adopt it. If co-signature protocols are open source, every bank can implement them. If knowledge ledger systems are freely available, every tutor and every student can participate. The network grows not because a single company rolled it out but because the components are cheap enough for everyone to build their own compatible version.

13.3 The Virtuous Cycle

The combination produces a virtuous cycle that has no precedent in economic history:

  1. AI increases productivity. Natural language software automates knowledge work across every sector. Output per worker rises.
  2. The Large Accounting Model makes the productivity gains visible. Competence accumulation on the knowledge ledger shows genuine skill growth. Decomposed drivers show which sectors are producing real gains.
  3. Central banks can distinguish productive growth from inflation. The decomposed, reliability-weighted data shows that growth is real. Monetary policy stays accommodative where the data supports it.
  4. Business confidence rises because information is accurate. Companies invest because they can see the gains are real and reliable. Capital expenditure increases.
  5. Wages rise with productivity. Labour markets tighten. But because central banks can see that the tightening is productivity-driven, they do not suppress it. Workers share in the growth.
  6. Rising wages increase demand. Workers spend. Revenue grows. The economy expands.
  7. The knowledge ledger captures the learning. Every tutorial, every AI interaction, every co-signed Competence assessment accumulates. The workforce becomes measurably more capable. Productivity rises further.
  8. The cycle repeats. Each iteration produces more growth, more confidence, more investment, more learning.

This cycle has stalled in every previous technological transition because of information failure. The gains from electrification, from computing, from the internet — all were real, but central banks could not see them clearly enough to let the economy run. The precautionary tightening kicked in. Growth was suppressed. The gains went to capital.

13.4 The Scale of the Opportunity

AI is not a single-sector technology. It affects every sector simultaneously. Natural language interfaces mean that the implementation cost is near zero. The Large Accounting Model means that the gains are visible, decomposed, and reliability-weighted. Central banks can run the economy hotter because they can see what is happening. Businesses can invest with confidence because the information is accurate. Workers can share in the growth because monetary policy is not suppressing it.

The next five to ten years could produce economic growth without modern precedent. Not because the technology is unprecedented — though it is — but because for the first time, the economic infrastructure can see the growth clearly enough to let it happen. The Large Accounting Model is not just a better accounting system. It is the informational precondition for the AI age to deliver on its promise. Without transparent economics, the AI wave will be captured by capital and suppressed by central banks, as every previous wave has been. With transparent economics, the gains reach everyone.

14. Conclusion

Five hundred years ago, Luca Pacioli formalised double-entry bookkeeping. It was a system for tracking money at points in time. It required a general ledger, adjusting journals, and reconciliation at every boundary. It worked. It also created an entire profession dedicated to the mechanical process of making the numbers agree.

The Large Accounting Model replaces this with two innovations, one expansion, and one currency.

The first innovation is Period Entry: recording transactions with their economic lifespan, so that accruals, depreciation, prepayments, and deferrals are computed automatically. This eliminates the general ledger.
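For illustration, the recurring charge can be computed from the entry itself, with no adjusting journal. The straight-line method below is an assumption (the paper does not fix an allocation method); the rounding remainder is pushed into the final period so the schedule always sums exactly to the recorded value.

```python
def straight_line_schedule(value: float, periods: int) -> list[float]:
    """Split a Period Entry's value evenly across its accounting
    periods, to 2 decimal places, with the rounding remainder in the
    final period so the schedule reconciles to the entry by construction.
    """
    per = round(value / periods, 2)
    schedule = [per] * (periods - 1)
    schedule.append(round(value - per * (periods - 1), 2))
    return schedule

# A 1,000.00 asset with a 12-month lifespan: the monthly depreciation
# charge is derived, not posted.
print(straight_line_schedule(1000.0, 12))
```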

The second innovation is co-signature: four identified parties cryptographically signing a single shared value. This eliminates bank, customer, supplier, and intercompany reconciliation. The transaction is proved at source.
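A minimal sketch of the co-signature check follows. HMAC-SHA256 over a canonical JSON serialisation stands in here for a real asymmetric signature scheme (such as Ed25519 public-key signatures, which a production protocol would need); the field names and party labels are illustrative.

```python
import hashlib
import hmac
import json

PARTIES = ("customer", "supplier", "customer_bank", "supplier_bank")

def sign(entry: dict, keys: dict[str, bytes]) -> dict[str, str]:
    """Each of the four parties signs the same canonical serialisation
    of the shared value."""
    payload = json.dumps(entry, sort_keys=True).encode()
    return {p: hmac.new(keys[p], payload, hashlib.sha256).hexdigest()
            for p in PARTIES}

def verify(entry: dict, sigs: dict[str, str], keys: dict[str, bytes]) -> bool:
    """A transaction is proven at source only when all four
    co-signatures check out against the same payload."""
    payload = json.dumps(entry, sort_keys=True).encode()
    return all(
        hmac.compare_digest(
            sigs[p], hmac.new(keys[p], payload, hashlib.sha256).hexdigest())
        for p in PARTIES)

keys = {p: p.encode() for p in PARTIES}  # demo keys only
entry = {"value": 250.0, "start": "2026-01-01", "end": "2026-12-31"}
sigs = sign(entry, keys)
print(verify(entry, sigs, keys))                 # all four parties agree
print(verify(dict(entry, value=999.0), sigs, keys))  # any change breaks it
```

Because all four parties sign one shared value, there is nothing left to reconcile: a mismatch is impossible to record, not merely detectable after the fact.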

The expansion is the knowledge ledger: tracking Competence and its reliability, Collaboration and its reliability, Curiosity and its reliability, alongside the monetary value. Co-signed by tutors and students, accumulated over a lifetime, replacing the binary credential with a continuous, multi-dimensional, reliability-weighted measure of what a person knows and how well they work.

The currency is the Spenny — study penny — issued by the Bank of Learning, backed by the accumulated knowledge of the network, and circulating through every learning transaction on the system. Spennies fund development through presale to early participants, flow through tutorials and content, and appreciate as the network's co-signed Competence grows. A currency backed not by government decree or speculative demand but by measurable, auditable, reliability-weighted human knowledge.

The economic model — market size, share, price, cost, capital structure — lives inside the Period Entry as explicit, visible dimensions. Beta and beta reliability emerge from the channel structure. The model is not separate from the data. The model is the data. Forecasts extend from transaction end dates. Knowledge accumulation precedes and predicts economic growth. Symmetric information increases economic efficiency. Transparency removes corruption. Every company becomes listable. Every dimension becomes tradeable. Monetary policy becomes precise enough to let the economy run.

And the timing is not accidental. AI is collapsing the cost of software to near zero. Natural language programming makes the specification the implementation. The Large Accounting Model does not need a decade-long infrastructure programme. It needs its components described clearly enough for AI to build them — and for those components to be open source, ubiquitous, and free. The technology that makes the Large Accounting Model possible is the same technology whose economic gains the Large Accounting Model is designed to capture and distribute.

This is what accounting was always trying to be: a complete, honest, real-time record of economic reality. It just needed to track both accumulations, money and knowledge; it needed the parties who know the truth to sign it; and it needed the moment when building it became as simple as describing it.