import random

constant = 0.4  # if > 0.5 the walk tends not to halt
n = 1
value = 0    # counts upward steps
length = 1   # total number of iterations before halting
maximum = 1  # largest value n reaches
while n:
    if random.random() < constant:
        n += 1
        value += 1
    else:
        n -= 1
    length += 1
    maximum = max(maximum, n)
Both length and maximum appear to follow power-law distributions, with the exponent depending on constant.
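A quick sketch of how one might check that claim numerically; run_walk and tail_exponent below are my own illustrative helpers, not part of the original note, and the Hill-style fit is crude, meant only for eyeballing the tail.

import random
import math

def run_walk(constant, rng=random.random):
    # One run of the walk above; returns (length, maximum).
    n, length, maximum = 1, 1, 1
    while n:
        if rng() < constant:
            n += 1
        else:
            n -= 1
        length += 1
        maximum = max(maximum, n)
    return length, maximum

def tail_exponent(samples, xmin=4):
    # Crude Hill-style maximum-likelihood estimate of a power-law tail exponent.
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

lengths, maxima = zip(*(run_walk(0.4) for _ in range(20000)))
print("length exponent ~", tail_exponent(lengths))
print("maximum exponent ~", tail_exponent(maxima))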
This is nice because:
- It's a plausible model of the best-fit synthesis patch size stuff, which I am guessing has some similarity to echolalia and memory.
- It's a simple and somewhat plausible model of a thought sequence evolving over time -- the thought activating and deactivating processing units pseudo-randomly, as in a randomly connected network.
- It indicates that durations as well as magnitudes of response may yield useful measurements.
So what I am thinking is: look at sentence length and paragraph length in autistic vs normal writing. This data should be easier to get hold of. [though note that if the process contains multiple events occurring in parallel, such as an activation pattern in a randomly-connected network, the duration may not equal the number of iterations]
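A minimal sketch of how the sentence-length and paragraph-length measurement might go, assuming a plain-text corpus; the file name and the naive sentence splitter are placeholders, not a prescribed method.

import re
from collections import Counter

def sentence_lengths(text):
    # Naive split on sentence-ending punctuation; a real corpus would need a
    # proper tokenizer, but this is enough to eyeball the distribution.
    sentences = re.split(r'[.!?]+', text)
    return [len(s.split()) for s in sentences if s.split()]

def paragraph_lengths(text):
    # Paragraphs separated by blank lines, measured in sentences.
    paragraphs = re.split(r'\n\s*\n', text)
    return [len(sentence_lengths(p)) for p in paragraphs if p.strip()]

with open("sample.txt") as f:  # placeholder corpus file
    text = f.read()

print(Counter(sentence_lengths(text)).most_common(10))
print(Counter(paragraph_lengths(text)).most_common(10))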
... a slight extension: the constant could be a function of n, mostly constant but smaller for n close to zero, and with some sort of saturation effect for large n. ... this also suggests a way to model learned helplessness, and ways to go about curing it.
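A minimal sketch of that extension, assuming one particular shape for the n-dependent constant (a dip near zero and a hard cap at large n); both the functional form and the parameter values are illustrative guesses, not part of the original note.

import random

def p_up(n, base=0.45, dip=0.2, cap=8):
    # Probability of n increasing: reduced when n is close to zero (the dip),
    # and cut off entirely once n reaches the saturation level cap.
    if n >= cap:
        return 0.0  # saturation: no further growth
    return base * (1.0 - dip / (n + 1))

def run_walk_variable():
    n, length, maximum = 1, 1, 1
    while n:
        if random.random() < p_up(n):
            n += 1
        else:
            n -= 1
        length += 1
        maximum = max(maximum, n)
    return length, maximum

lengths, maxima = zip(*(run_walk_variable() for _ in range(10000)))
print("mean length", sum(lengths) / len(lengths))
print("mean maximum", sum(maxima) / len(maxima))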