This wikilog article is a draft; it has not been published yet.

Julia programming

One can generate an array from a range <tt>1:10</tt> using <tt>collect()</tt>:
<syntaxhighlight lang="julia">
collect(1:10)
</syntaxhighlight>
<pre>
10-element Array{Int64,1}:
  1
  2
  3
  4
  5
  6
  7
  8
  9
 10
</pre>
  
Let us now turn to a typical numerical problem: solving differential equations. We will start with Ordinary Differential Equations (ODEs). Namely, we consider equations of the type:
  
$$\label{eq:ODE1} y'(t)=f(t,y(t))$$
  
The most famous algorithm is Euler's method, and the most popular is the RK4 variant of the Runge–Kutta methods. In between lies a variety of possible implementations, many related to one another in some way. Methods can be explicit (the solution only depends on quantities already known) or implicit (the solution involves other unknowns, which requires solving intermediate equations). The backward Euler method is an implicit method. Euler's method is the simplest one and the typical starting point to introduce the basic idea. Not surprisingly, it is also one of the least efficient. We discretize time into steps $t_1$, $t_2$, $\cdots$, $t_n$ separated by the same interval $h$. The solutions at these times are denoted $y_n=y(t_n)$. Back to \eqref{eq:ODE1}, approximating $y'(t)\approx (y(t+h)-y(t))/h$, we have:
  
$$y_{n+1}=y_n+hf(t_n,y_n)$$
  
This is very simple: from the solution $y_n$ at time $t_n$, we can compute the solution at time $t_{n+1}$ explicitly.
  
Let us solve, for instance, $y'=-y$, whose solution is the [[exponential]] function.
  
<syntaxhighlight lang="julia">
npts = 100; # number of points
y=0.0*collect(1:npts); # predefine the array
h=.1; # timestep, so total time = npts*h=10
# The differential equation to solve
function f(t,y)
    -y
end
y[1]=1; # initial condition
# The numerical method (Euler's)
for i=1:npts-1
    y[i+1]=y[i]+h*f((i-1)*h,y[i])
end
</syntaxhighlight>

And we can now compare the numerical solution to the real one:

<syntaxhighlight lang="julia">
plot(((1:npts).-1)*h, [y[i] for i=1:npts], label="Euler", xlabel="t", ylabel="y'=-y")
plot!(x->exp(-x),0, 10, label="Exact")
</syntaxhighlight>
  
<center><wz tip="Euler's method vs exact solution. The right one had the further option yscale=:log10">[[File:Screenshot_04-03-2020_083840.jpg|600px]]</wz></center>
  
Let us try a more difficult example:
  
$$y'={y\over 2}+2\sin(3t)\label{eq:PaulODE}$$
  
whose solution can be found with an integrating factor [http://tutorial.math.lamar.edu/Classes/DE/Linear.aspx]
 
$$y=-{24\over37}\cos(3t)-{4\over37}\sin(3t)+\left(y(0)+{24\over 37}\right)e^{t/2}$$

In this case the differential equation is defined as:

<syntaxhighlight lang="julia">
function f(t,y)
    (y/2)+2sin(3t)
end
</syntaxhighlight>
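To check the method on this equation, one can rerun the same Euler loop. The following self-contained sketch (with an assumed step $h=0.01$ and initial condition $y(0)=0$, for which the growing exponential in the solution has coefficient $24/37$) compares the numerical and exact solutions:

```julia
# Sketch: Euler's method for y' = y/2 + 2sin(3t), compared to the exact solution.
f(t,y) = (y/2)+2sin(3t)
npts = 1000; h = .01;          # integrate up to t = 10
y = zeros(npts)
y[1] = 0.0                     # initial condition y(0)=0
for i = 1:npts-1
    y[i+1] = y[i]+h*f((i-1)*h,y[i])
end
# Exact solution for y(0)=0 (from the integrating-factor formula)
exact(t) = -(24/37)*cos(3t)-(4/37)*sin(3t)+(24/37)*exp(t/2)
err = maximum(abs.(y .- exact.(((1:npts).-1)*h)))
```

Since the exact solution grows like $e^{t/2}$, the absolute error of Euler's method grows with it, which makes this a more demanding test than the decaying exponential above.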
  
The [[Volterra-Lotka model]] reads:

$$\begin{align}
y_1'&=\alpha y_1-\beta y_1 y_2\\
y_2'&=\delta y_1 y_2-\gamma y_2
\end{align}$$
 
Euler's method in this case reads:

$$\begin{align}
y_{1,n+1}&=y_{1,n}+hf_1(t_n,y_{1,n},y_{2,n})\\
y_{2,n+1}&=y_{2,n}+hf_2(t_n,y_{1,n},y_{2,n})
\end{align}$$

The functions are implemented as:

<syntaxhighlight lang="julia">
function f1(t,y1, y2)
    α*y1-β*y1*y2
end
function f2(t,y1, y2)
    δ*y1*y2-γ*y2
end
</syntaxhighlight>

<syntaxhighlight lang="julia">
npts=10000
h=.001

y1=0.0*collect(1:npts);
y2=0.0*collect(1:npts);

y1[1]=1;
y2[1]=.1;

α=2/3;
β=4/3;
γ=1;
δ=1;

@time for i=1:npts-1
    y1[i+1]=y1[i]+h*f1((i-1)*h,y1[i],y2[i])
    y2[i+1]=y2[i]+h*f2((i-1)*h,y1[i],y2[i])
end
</syntaxhighlight>

<syntaxhighlight lang="julia">
plot([[y1[i] for i=1:npts], [y2[i] for i=1:npts]])
</syntaxhighlight>

<syntaxhighlight lang="julia">
plot([(y1[i], y2[i]) for i=1:npts])
</syntaxhighlight>

Euler's method is not very accurate. It can be improved by considering a better estimation for the tangent. This is known as Heun's method:

$$y_{n+1} = y_n + {h\over2} (f(t_n, y_n) + f(t_n + h, y_n +  h f(t_n, y_n)))$$

Let us apply it to our differential equation \eqref{eq:PaulODE}:

<syntaxhighlight lang="julia">
for i=1:npts-1
    t=(i-1)*h
    ỹ=y[i]+h*f(t,y[i])                      # predictor: Euler estimate of y(t+h)
    y[i+1]=y[i]+(h/2)*(f(t,y[i])+f(t+h,ỹ))  # corrector: average of the two slopes
end
</syntaxhighlight>

Heun's method is one of the simplest of a class of methods called predictor-corrector algorithms (Euler's method is only a predictor method).

We now cover one of the most powerful predictor-corrector algorithms of all, one which is so accurate that most numerical packages use it as a default: the fourth-order Runge–Kutta method. RK methods are optimized predictor-corrector methods, pushing the idea behind Heun's method to its limit.

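As an illustrative sketch (not the article's own code), the classical RK4 update for $y'(t)=f(t,y)$ evaluates four slopes per step and combines them with weights 1, 2, 2, 1. Applied to the earlier test problem $y'=-y$ with the same step $h=0.1$:

```julia
# Classical fourth-order Runge–Kutta for y'(t) = f(t,y); sketch on y' = -y, y(0)=1.
f(t,y) = -y
npts = 100; h = .1
y = zeros(npts); y[1] = 1.0
for n = 1:npts-1
    t = (n-1)*h
    k1 = f(t, y[n])              # slope at the start
    k2 = f(t+h/2, y[n]+h/2*k1)   # slope at midpoint, using k1
    k3 = f(t+h/2, y[n]+h/2*k2)   # slope at midpoint, using k2
    k4 = f(t+h, y[n]+h*k3)       # slope at the end, using k3
    y[n+1] = y[n]+h/6*(k1+2k2+2k3+k4)
end
err = maximum(abs.(y .- exp.(-((1:npts).-1)*h)))  # orders of magnitude below Euler's error
```

With the same step size, the error drops from first order in $h$ (Euler) to fourth order, which is why RK4 is such a common default.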
http://tutorial.math.lamar.edu/Classes/DE/Bernoulli.aspx
  
http://calculuslab.deltacollege.edu/ODE/7-C-3/7-C-3-h.html
  
The backward Euler method, or implicit Euler method, evaluates the slope at the arrival point instead: $y_{n+1}=y_n+hf(t_{n+1},y_{n+1})$. Since the unknown $y_{n+1}$ appears on both sides, each step requires solving an equation, which makes the method implicit, but also considerably more stable.
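A minimal sketch of the scheme, assuming the test problem $y'=-y$: here the implicit update $y_{n+1}=y_n-hy_{n+1}$ can be solved by hand (in general a nonlinear solve is needed at each step):

```julia
# Backward (implicit) Euler on y' = -y: each step solves y[n+1] = y[n] - h*y[n+1].
npts = 100; h = .1
y = zeros(npts); y[1] = 1.0
for n = 1:npts-1
    y[n+1] = y[n]/(1+h)   # closed-form solution of the implicit update
end
err = maximum(abs.(y .- exp.(-((1:npts).-1)*h)))  # first-order accurate, like forward Euler
```

The accuracy is comparable to forward Euler, but the update $y_{n+1}=y_n/(1+h)$ decays for any $h>0$, whereas the explicit update $y_{n+1}=(1-h)y_n$ blows up for $h>2$: stability, not accuracy, is what implicit methods buy.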
{{WLP6}}

Latest revision as of 21:51, 15 March 2021

<center><wz tip="Still frames for an excitable medium with a=11 and g=2">[[File:Screenshot_20210315_164359.png|400px]]</wz></center>

Here is the same but animated:

<center><wz tip="250 iterations of the a=11 g=2 excitable model.">[[File:excitable-medium-a2-b11.gif|400px]]</wz></center>

Playing with the parameter space:

<center><wz tip="Two (out of the many) configurations:">[[File:Screenshot_20210315_181551.png|400px]]</wz></center>

And the bottom case, animated:

<center><wz tip="250 iterations of the a=10 g=20 excitable model with a lot of initially infected cells in various stages, preventing extinction.">[[File:excitable-medium-a10-b20.gif|400px]]</wz></center>

It is interesting to focus on the surviving spot. We can zoom in on this area and see what is going on there. The cells have locked into this stable pattern, involving, apparently, two cells only:

<syntaxhighlight lang="julia">
excited = @animate for i ∈ 1:30
    excite()
    plotspace(145, 220, 225, 300)
end
</syntaxhighlight>

<center><wz tip="A stable 2-cells pattern that survives in the g>a configuration by locking two neighbours with a period a+b=30.">[[File:stable-2cells-pattern.gif|400px]]</wz></center>

It is interesting as well to change parameters during the simulation. This locks the patterns into dislocated sequences which give rise to apparent cycles of stationary evolution in space, which, however, are due to structures that formed under previous conditions of evolution. For instance, the following was obtained by running 50 iterations with parameters <tt>a=50</tt> and <tt>g=25</tt>, then changing to <tt>g=5</tt> for 50 iterations, and coming back to the initial <tt>g=25</tt>, cycling forever:

<center><wz tip="Strange oscillations in the wake of changing parameters in mid-flight. This is an interesting optical effect due to dephasing of the wavefronts caused by dislocations in the structure following various types of evolutions.">[[File:strange-oscillations.gif|400px]]</wz></center>

There clearly appear to be two types of oscillations: one seemingly stationary in space, while the other travels as wavefronts that get periodically frozen. This is, however, merely an optical artifact, which can be well understood by looking at the state of the cells' evolution in time. Here is such a cut, obtained in a configuration similar to the one displayed as a full density plot:

<syntaxhighlight lang="julia">
for i=1:75
    global iteration += 1
    println(iteration)
    excite()
    Plots.display(plot(space[[100],:]', ylims=(0,a+g), lw=2, legend=false))
end
</syntaxhighlight>

<center><wz tip="Slice through the cellular automaton showing two types of oscillations, as a result of dislocations: drifting and collapsing.">[[File:wavefronts-excitable-medium-cut.gif|400px]]</wz></center>

One can see how the dislocations cause these alternations of drifting or collapsing evolution of the cell states.

Clearly there are many variations and configurations to explore; this barely scratches the surface. Instead of delving further into this model, we turn to the most famous and also more impressive Hodgepodge machine.