I have been thinking about two approaches to handling messiness in system design: one oriented towards mess ‘prevention’, and another that is more tolerant of messiness but instead prioritizes sense-making. Of course, it is possible to spend effort on both fronts when designing and implementing a system. But sometimes a compromise has to be made and one approach has to prevail over the other.
This thought process was seeded by recent experiences at work and by points raised in online discussions that I follow. After boiling the issues down to what I perceive to be a root cause of many misunderstandings, I realized just how much I prioritize sense-making over mess prevention. Don’t get me wrong: I really value mess prevention, and a good system or process design should set well-defined boundaries. However, if I could think of an efficient way to make sense of existing or future mess, I would gradually lose ‘faith’ in the need for mess-prevention efforts.
For example, my preference for sense-making leads to a heightened interest in natural language processing rather than required semantic mark-ups. I admit that I routinely use structured data formats, such as those found in JSON/XML specifications or implied in relational database systems, so I don’t think I under-appreciate the importance of mess prevention. However, where there are no systems or specifications in place, I am more open to considering lightweight systems with minimal requirements. Similarly, I like dynamically-typed programming languages and search engines that attempt to make sense of whatever I write, however I write it.
There is just too much to lose when requirements are over-specified too early. In a research environment such as where I work, I think the tools chosen should be more naturally inclined towards sense-making than mess prevention. Enforcing manual code versioning steps is simply inappropriate, especially when there is an opportunity to automate routine commands, such as through hooks in git or mercurial. Time spent complying with strict standards would be better spent conducting actual research. A tool with a good balance of tolerance for messiness and built-in capabilities for sense-making should be preferred over one that requires strict enforcement.
It’s sad that I have not been given an opportunity to explain all of the above where I work. But at least I have my personal projects, where I can express the approach I prefer, and blogs to demonstrate what I mean.
One way to look at ‘coupling’ is as the relationship that a transaction record implies between transactors. For example, the use of community currency implies membership in the same community or in communities that have an agreement in place to accept each other’s currency. In Ripple, direct transactions occur only if the transactors have preset limits with each other, and payments have to be routed through pre-established accounts. These requirements may be viewed as ‘static binding’ or predetermined configurations that limit who could trade with whom.
In the case of mutual credit community currencies, it is not hard to see similarities with the pre-WWW philosophy that hyperlinks are to be made and maintained through a centralized database. A requirement that transactors must have an account with the same mutual credit accounting system highlights this similarity. But the similarity extends beyond considerations of ledger boundaries and into logical or abstract rules for initializing who could trade with whom. As long as the ability to transact requires a predetermined relationship, the ‘localized hyperlinks’ mindset applies even if the accounts were maintained in separate servers or accounting systems.
Under tyaga.org’s implementation scenarios, a transaction does not automatically imply that the transactors are members of the same community or that they have established credit limits with each other prior to the transaction. Anyone could potentially trade with anyone else. It is up to a potential recipient to accept or reject a currency brand at the time of transaction – essentially, the concept of dynamic or late binding as applied to a currency system design. This is similar to the WWW philosophy of allowing any web page to link to another page at-large, which is a less-controlled way of doing things but inherently more flexible and scalable. Even though mutual consent is required for an inter-entity transaction, such consent is only applicable to one instance of payment and does not imply past or future guarantees of currency brand acceptability between two brands.
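The late-binding idea above can be sketched in a few lines of Python. This is only an illustration of the concept, not code from any actual implementation; the names (Payment, accept_payment, the brand strings) are all hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Payment:
    payer: str
    brand: str      # currency brand offered as payment
    amount: float


def accept_payment(payment: Payment, acceptable_brands: set) -> bool:
    """Decide at transaction time whether to accept the offered brand.

    Nothing is pre-configured between the transactors; acceptance of
    this one payment implies no past or future guarantee of accepting
    the same brand again (dynamic or late binding).
    """
    return payment.brand in acceptable_brands


# A seller consults its current (possibly changing) acceptance set per payment.
seller_accepts = {"google.com/USD", "seattle.gov/USD"}
p = Payment(payer="alice", brand="google.com/USD", amount=25.0)
print(accept_payment(p, seller_accepts))  # True for this payment only
```

Contrast this with static binding, where the check would happen once at account-setup time and every later transaction would be constrained by that earlier configuration.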
Loose coupling, as described in a previous post, is expected to lower the barrier of setting up new currency brands, leading to more spontaneous currency brand creation (i.e., more ‘new web sites’ in the current analogy). Dynamic binding, as described in this post, is expected to lead to more diverse market selections and higher frequencies of inter-entity transactions (i.e., more ‘interdomain links’).
As with all design trade-offs, however, there is a price to pay for such high expectations. Tyaga.org’s design approach for achieving flexibility and scalability comes at a substantial cost: stricter reporting requirements and greater dependence on service providers to make sense of huge volumes of transaction data. The challenges of reconcilable reports, auditors and currency brand indices arise because each entity is allowed and encouraged to set its own currency limits as budgets, without having to predetermine transaction boundaries or specific entities as revenue sources.
In assessing my project plans for this year, I reviewed the core requirements that the implementation is trying to address. In an effort to simplify the core requirements even further than the one-page ‘game’ representation, I have arrived at the following three main concepts:
1. An independent currency brand corresponds to, and is issued by, an entity with a self-determined mission to provide certain goods and services to the market or general public.
2. Independent currency issuance is defined as an equivalent increase in the unused revenue and expense budgets of an entity.
3. Published reports are necessary to audit the whole currency lifecycle, including the corresponding inflows and outflows between entities.
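Concept #2 can be made concrete with a minimal sketch: issuance increases an entity’s unused revenue budget and unused expense budget by the same amount. The class and field names below are illustrative only, not taken from the ocaup model or any existing code.

```python
from dataclasses import dataclass


@dataclass
class EntityBudgets:
    """Hypothetical budget state for one currency-issuing entity."""
    unused_revenue_budget: float = 0.0
    unused_expense_budget: float = 0.0

    def issue(self, amount: float) -> None:
        # Independent issuance (concept #2): both unused budgets
        # rise by the same, equivalent amount.
        self.unused_revenue_budget += amount
        self.unused_expense_budget += amount


e = EntityBudgets()
e.issue(1000.0)
print(e.unused_revenue_budget, e.unused_expense_budget)  # 1000.0 1000.0
```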
Among these main concepts, the definition of currency issuance might seem the most arbitrary to others. I now realize that this same definition also implies the other currency activity definitions in the ocaup accounting model. So instead of having to explain and defend the whole ocaup model (which really is not that complicated), I really only have to explain why #2 is so important to the design of accounting systems and payment protocols that affect inter-entity transactions.
First, it must be noted that all currency designs require an accounting restriction of some sort. Community currencies impose geographic or shared-interest boundaries on where currencies might circulate. Ripple requires payments to be routed through a pathway of neighboring nodes. Traditional currencies impose restrictions on who could issue fiat notes into general circulation. So having a set of accounting restrictions to guide the issuance and use of currency is nothing new, and all such restrictions are likely to be viewed as arbitrary. But why insist on #2?
The short answer is that #2 leads to looser coupling between currency brands. Loose coupling makes it easier to noncooperate with a particular entity by not accepting its currency brand.
For example, imagine a politician with a campaign budget that is funded by donations. If the politician’s ability to raise her campaign budget is tied to the ongoing donations that she receives, then she is more likely to care about satisfying the special interests of big donors such as lobbyists. In contrast, if she is able to fund her campaign budgets independently of the donations that come in within a given period, then there is less pressure to attract or retain large donors. She would worry more about the acceptability of her currency brand to market participants and the general public. Because of #2, her currency activity is tied to the self-determined limits that her organization has set to conduct its campaign.
The other implication of #2 is that inter-entity currency flow is allowed as long as the payments do not lead to an increase in the unused budgets. In fact, considering that each entity fulfills a specialized role, #1 and #2 acknowledge that inter-entity transactions are to be expected and supported. Loose coupling does not have to lead to isolated currency systems.
When each market entity decides to issue its own currency as unused budgets, it would be impossible and counterproductive to predict which currency brand is going to be offered as payment for a transaction at any given time. Even if each person carries only one or two currency brands, a seller faces the prospect of receiving payments in many different currency brands from different customers (employees of google.com, seattle.gov, etc.). Clearly, an inter-entity payment protocol must factor such currency brand diversity into implementation use-case scenarios. Real-time advisories on currency brands would be the best approach, since revenue sources would not be unnecessarily restricted through pre-emptive brand rejection based on potentially stale information.
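The difference between a real-time advisory and a pre-emptive blacklist can be sketched as follows. The index service, its advisory labels, and the freshness threshold are all hypothetical; the point is only that stale or missing data is treated as ‘unknown’ rather than as grounds for rejecting a brand in advance.

```python
import time

# Pretend brand index: brand -> (advisory label, timestamp of last report).
BRAND_INDEX = {
    "google.com/USD": ("good-standing", time.time()),
    "seattle.gov/USD": ("reports-pending", time.time() - 86400),
}


def advisory_at_sale(brand: str, max_age_seconds: float = 3600.0) -> str:
    """Fetch the freshest advisory for a brand at transaction time.

    Stale or missing data yields 'unknown' instead of a pre-emptive
    rejection, so revenue sources are not unnecessarily restricted.
    """
    entry = BRAND_INDEX.get(brand)
    if entry is None:
        return "unknown"
    advisory, reported_at = entry
    if time.time() - reported_at > max_age_seconds:
        return "unknown"
    return advisory


print(advisory_at_sale("google.com/USD"))   # fresh report: good-standing
print(advisory_at_sale("seattle.gov/USD"))  # day-old report: unknown
```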
To summarize, currency traceability to an entity, loose coupling between currency brands, and auditable reports of currency activity are important design goals that facilitate informed cooperation and noncooperation with specific entities.
Now that I have finished concentrating on other projects and commitments, I look forward to continuing tyaga.org’s development work. One of the web sites that I visited recently was the Community Way (CW) in Comox Valley. The information design aspect that caught my interest was the ‘current numbers’ page, with links to a graph and spreadsheets.
What follows is not a critique of the reporting system design as used by CW – for all I know, that design serves CW’s needs perfectly. I also do not question the community emphasis of the CW currency system and I sincerely would like to see such a system succeed wherever it is implemented. My goal in offering the following comparative analysis is to better explain the technical nuances behind tyaga’s evolving IS design requirements.
It has become obvious to me that offline transaction instruments, such as minted notes, checks, or store-and-forward devices, cannot be tracked efficiently and would not be conducive to the development of dynamic currency brand reporting systems. So while my earlier design notes referred to the importance of offline devices, I have since revised the technology requirements to focus on online devices. The most promising device in this regard is a basic cell phone with SMS capability, which is already widely deployed and inexpensive to own. While a QR-code app is not required to post transactions through SMS, a camera phone with that capability would simplify data entry and transfer between transactors.
Another design change, as described in a recent post, is the use of public keys for better non-repudiation and auditing capabilities. The example index on this site clearly illustrates the required tracking and metrics at the level of an entity, as represented by its currency brand, and not simply aggregated for a whole community as shown in CW’s current numbers graph. I am not sure of CW’s requirements for auditability, but tyaga’s design requirements include the auditability of any published performance and evaluation information for each entity.
Finally, it is clear that an under-developed reporting system does not hinder CW’s implementation. In contrast, a robust and dynamic reporting system is required to implement tyaga’s concept of spontaneous, targeted non-cooperation against specific currency brands. A currency brand index, constructed with dynamic information from reporting systems, should help participants make informed decisions on whether to accept or reject a currency brand in a transaction.
Again, I applaud CW as one of the rare, actual, working implementations of alternative trading systems. At the same time, I am reminded that there are not many development or implementation efforts that seek to address tyaga’s main information systems requirements.
For the rest of the year, I will investigate a substantially different approach to PaCT. I have learned important lessons while working on Prowl and PaCT this year, which has prompted a series of changes since early this summer.
The main lesson has to do with the importance of non-repudiation in a payment protocol such as PaCT. I have been trying to avoid designing PaCT around asymmetric encryption, with the idea that anything involving public key distribution, verification and revocation would lead to too much system complexity. Unfortunately, the ‘independent-witness-on-demand’ idea produces its own set of complexities while not giving as strong a sense of non-repudiation as a digital signature from each entity. In addition, it is more appropriate to move the concern of transparency to the separate stage of periodic report publication and audits.
I plan to revise code and documentation to reflect a refined strategy to be built around a core manifest or declaration document. Each entity with its own currency brand publishes a url to its manifest. The manifest will contain three main elements: certificate, accountant and report.
- Certificate element: describes an entity’s public key and a list of certificate urls (x.509, pgp or some other format) for verification purposes. As with any pki or web-of-trust scheme, a seller must trust the issuer or endorser of the certificate and support the representation format used.
- Accountant element: describes an entity-assigned url for submitting transaction records. A designated accountant will be able to produce or verify digital signatures on an entity’s behalf. There may be a list of urls when different accountants are used for different currency units and transaction processing protocols.
- Report element: describes the corresponding urls to a list of audited and pending reports. Child elements will include transaction period, currency unit, auditor, content-type, etc.
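To make the three elements concrete, here is one possible shape for a manifest, rendered as a Python dict for illustration. The element names follow the list above, but the specific keys, urls and values are hypothetical placeholders, not a committed format.

```python
# Hypothetical manifest shape: certificate, accountant, report.
# All urls and field values below are placeholders.
manifest = {
    "certificate": {
        "public_key": "-----BEGIN PUBLIC KEY----- ...",
        # x.509, pgp, or some other format the seller may support
        "certificate_urls": [
            {"format": "x509", "url": "https://example.org/cert.pem"},
        ],
    },
    "accountant": [
        # possibly one url per currency unit / transaction protocol
        {"currency_unit": "USD", "url": "https://example.org/txn"},
    ],
    "report": [
        {
            "transaction_period": "2009-Q1",
            "currency_unit": "USD",
            "auditor": "https://example.org/auditor",
            "content_type": "text/csv",
            "status": "audited",
            "url": "https://example.org/reports/2009-q1.csv",
        },
    ],
}

print(sorted(manifest))  # the three main elements
```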
More details to follow in upcoming posts.
I have recently come across Postel’s Law, which is typically quoted as: “Be conservative in what you do; be liberal in what you accept from others.”
This is also known as the Robustness Principle. It reflects tyaga’s vision of how independent currency brands should operate and interact: “Be strict in setting your own limits and performance; be tolerant in accepting other currency brands’ limits and performance.”
In other words, in order to promote the adoption and spread of independent currency brands, it is important to expect that most entities will perform poorly relative to their initial budgets. The important thing is to observe dedication and perseverance in the work, and not to expect immediate success. A dynamic index is therefore a means for evaluating the progress of an entity through its currency brand, not a means for avoiding transactions that could help another entity reach its goals. Something to think about when studying and promoting the use of dynamic indices.
Would it be possible to reconcile the fundamental differences between the following two approaches:
“Currency designed for transactions between members of independent entities or brands”
“Currency designed for transactions between members of the same entity or community”
My doubts have resurfaced after reading some recent online discussions related to currency information systems. At issue is the importance of interoperability between different currency entities as supported by ad hoc service providers.
With community-oriented currencies, the expectation is for the majority of transactions to occur between members of the same currency entity, so the administrative system of one currency does not have to worry about understanding information from another currency system. This obviously has the advantage of freeing each currency issuer to configure their currency and transaction grammar as needed, without worrying about what other issuers are doing.
In contrast, tyaga.org expects that separation of concerns and scalability would inevitably lead to a majority of transactions occurring between members of different currency entities. Where each entity specializes in serving a particular market need, access to product diversity is only possible through inter-entity trades. Each entity could still configure currency settings and use proprietary messaging/record formats. However, the need for publishing and reporting conventions is going to be unavoidable under a scenario of global inter-entity transactions.
It was interoperability concerns that led to the development of Prowl as a potential starting point for discussing uniform representation and standardized accounting terms, while still allowing for variations in parameters and calculation specifications. Without interoperability, it would not be possible to achieve traceable and auditable currency brands.