Base Complexity is a proposed measurement that accounts for the total complexity of a given application, framework, or graph of connections. In contrast to Big O notation, it examines the complexity of an application's state mutations across a series of steps. It is interesting to note that the default complexity of virtually all applications to this point has been just Base 1: a function or method that modifies some state in a single time sequence. Base 0 would be the block-scoped mutation of a single variable.
Game AI has traditionally sat around Base C 1-2, which counts the number of actions accounted for in a sequence; any string of actions greater than 2 begins to introduce side effects into other systems.
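As a minimal sketch of this counting idea, the following hypothetical helper tallies the state-mutating steps in one agent sequence. The names and structure are illustrative only, not a formal Base Complexity definition.

```python
# Hypothetical sketch: count the state mutations an agent performs in
# one decision sequence. Base C here is just the number of steps that
# actually mutate state; names are illustrative.

def base_complexity(actions):
    """Base C = number of state-mutating steps in one sequence."""
    return sum(1 for action in actions if action["mutates_state"])

# A typical game-AI turn: move, then attack (Base C 2).
turn = [
    {"name": "move",   "mutates_state": True},
    {"name": "attack", "mutates_state": True},
    {"name": "idle",   "mutates_state": False},  # no mutation, not counted
]

print(base_complexity(turn))  # 2
```

Under this counting, appending a third mutating action to the turn would push the sequence past Base C 2, the point at which the text above suggests side effects start leaking into other systems.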
The true difficulty this notation exposes is the exponential growth of an application's complexity with each additional step in a sequence. Conversely, if a grand strategy engaged by a system requires a specific value to remain locked, holding that value fixed eases the difficulty of devising these sequences.
An analog from daily life is the ride share. Suppose you and a partner or friend share the same car. Your strategy for the day directly involves the location of that car within a network and whether you are the one being picked up. Your strategy must likewise account for the possibility of that car being late or in an accident, in which case both parties engage a new set of strategies in order to shift back to the original optimum routine. This complexity is some Base C 2 + N for the scope of getting to work and back: the complexity of the additional strategies required after an accident is that N, a discovered unknown, since the severity of the accident itself cannot be known in advance.
Now say the accident totals the car and no passenger is harmed. For the one who is late, the strategy is Base 1: a ride share. But the shared strategy is some Base C 3, covering the scope of managing the original strategy, a rental, and the eventual purchase of a new car. The purchase itself is its own complexity, entirely dependent on the specifics of the situation, so the whole is some Base C 3 + N. In the most ideal case there is no credit check to account for, you have full funds for the down payment, and there is little decision making in the process, so the N term collapses to a single trivial step (N = 1) and the total complexity is effectively just Base C 3.
Where this methodology would matter most in computer science is cloud networking: the movement of data across a series of edge points so that it sits closest to the user. The main difficulty presented in the cloud is updating the user's data while also keeping that user's edge point current. Yet even this is still just some Base C 2 + 2: you move the data to the edge, render it on the user's screen, then update the record at the origin, which cascades that data back to the edge. The complexity begins after the data is transferred to the user. Up to that point, all data mutations sit entirely within Base C 2; the moment the user connects, interactivity with that user's data adds a factor of 2, thus Base C 4. This demonstrates how Base Complexity highlights the potential of a value being out of date, or of data races, depending on your edge update strategy.
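The four mutation steps above can be sketched with two plain dictionaries standing in for the origin store and an edge cache. All names here are hypothetical; the point is the stale-read window between the user's write and the cascade back to the edge.

```python
# Illustrative sketch (names hypothetical) of the four mutation steps:
# origin -> edge, edge -> user render, user -> origin update,
# origin -> edge cascade. The stale window appears between steps 3 and 4.

origin = {"profile": "v1"}
edge = {}

# Step 1: replicate the record to the edge point closest to the user.
edge["profile"] = origin["profile"]

# Step 2: the user reads (renders) the edge copy.
rendered = edge["profile"]

# Step 3: the user writes an update back to the origin.
origin["profile"] = "v2"

# Step 4: the origin cascades the change to the edge. Until this runs,
# the edge still serves "v1" -- the out-of-date value that Base C 4
# is meant to surface.
stale_before_cascade = edge["profile"] != origin["profile"]
edge["profile"] = origin["profile"]

print(rendered, stale_before_cascade, edge["profile"])  # v1 True v2
```

A second client reading the edge between steps 3 and 4 would see `v1` while the origin already holds `v2`, which is exactly the race the edge update strategy has to manage.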
This is important to recognize because the majority of what we do has been limited to a baseline complexity of 1: we move some data to a user, and the user updates some table on the server. The consequence of allowing multiple participants into a distributed system that edits a series of tables is that we should account for the total increase in steps (especially off network), not just the exponential nature of a single step.
The true reason for this introduction is that it also gives us a measure by which to judge code that an AI might create, and whether that code is capable of halting. The unfortunate side effect of potential data races in a complex environment is the possibility of passing the same value back and forth, stuck in an infinite loop. This can likewise be extended into a broader measurement of any model's ability to reason within some Base Complexity, while highlighting that the standard complexity of what we engage in is just some Base C 1, even as we take for granted the difficulty of managing a ride share.
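The ping-pong failure mode described above can be sketched in a few lines: two handlers echo the same value between them, and a simple guard that tracks previously seen states forces the exchange to halt. The handler names and the guard are hypothetical, one of many possible cycle checks.

```python
# Hypothetical sketch of the non-halting failure mode: two handlers
# pass the same value back and forth. Tracking previously seen states
# is one simple guard that forces the loop to terminate.

def ping(value):
    return value  # echoes the value back unchanged

def pong(value):
    return value  # likewise makes no progress

def exchange(value, max_rounds=100):
    seen = set()
    for _ in range(max_rounds):
        value = pong(ping(value))
        if value in seen:      # same state revisited: without this
            return "halted"    # check, the exchange never makes progress
        seen.add(value)
    return "exhausted"

print(exchange(42))  # halted
```

In this sketch the repeated state is caught on the second round; an exchange that never revisits a state would instead run until `max_rounds` is exhausted, which is the bound a reviewer (human or automated) would reach for when halting cannot be proven.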