Building A Transformer AI From Scratch - Part 6: Learning Rates

Published on 13 February 2026 at 23:04

“The question is not whether intelligent machines can have any emotions, but whether machines can be intelligent without emotions.” — Marvin Minsky

 

If you think about what Marvin Minsky is asking, solving it would probably also be an effective answer to the nature versus nurture debate.

 

Introduction

 

This is my blog about what I am learning each week.

 

I built a transformer AI using another AI to vibe code it. I found the code it produced was workable: it compiled in C++, but it had some heavy optimisation issues that made it slow, usually because it tended to include unnecessary features, especially when AI was deployed to work in a team, i.e. analysing each other's code. I would call this my attempt to stress test agentic AI in a way where I can see how it works, and much of the code that I was most dissatisfied with was an artefact of this interaction.

 

I found that this caused issues. However, the bottom line is that the build worked and the error went down.

 

I am now using that codebase to work through optimisation testing, running A/B tests along the way to learn what makes or breaks the transformer design.

 

I have included an Appendix with test data used to build the benchmark. It may not mean much outside my model, but it is there.

 

Learning Rates And Other Small Things Matter

 

These are early days for me. I learn best by making mistakes in a controlled, safe environment, as quickly as possible, to avoid appearing inexperienced in high-pressure situations.

 

Therefore, I have been running many permutations of the core code for around 100 epochs in small toy simulations to see what works and what breaks.

 

I use the following sentences, with code that creates a word encoding, cycles through each sentence, and then shuffles the sentence order. A model that trains on specific, ordered data might develop saddle points that artificially boost its performance and/or end up stuck in a suboptimal training state.

 

"The cat sat on the mat"

"The dog sat on the log"

"A bird flew over the nest"

"A plane flew over the city"

"The sun shines in the sky"

"The moon glows at night"

"Cats and dogs are pets"

"Birds and planes fly"

 

This rule can be explained simply: if I reward you 2 points and then penalise you 2 points for being wrong at the same point and time, every time, those two values might end up cancelling each other out, so randomising the order avoids this.
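As a minimal sketch of what I mean (pure Python, not my actual C++ code; the names and the lowercased subset of sentences are illustrative): build a word encoding, encode each sentence, then shuffle the order each pass.

```python
import random

sentences = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "a bird flew over the nest",
]

# Build a word encoding: each unique word gets an integer id
vocab = {}
for sentence in sentences:
    for word in sentence.split():
        if word not in vocab:
            vocab[word] = len(vocab)

def encode(sentence):
    # Map a sentence to its list of word ids
    return [vocab[w] for w in sentence.split()]

encoded = [encode(s) for s in sentences]

# Shuffle the sentence order so reward and penalty at the same
# point and time cannot systematically cancel out across epochs
random.shuffle(encoded)
```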



The “values” are just a calculation of the distance between the correct target output and the model's output during teacher training. I was not using cross-entropy loss; I was just using a total distance to get a sense of the numbers (this may have been a mistake; see below).

 

I found that, depending on how you set the learning rate, you could get wildly different results.

 

The volatile runs that cluster around 26 are the ones that overshoot the optimal value for a given weight and oscillate. The optimal learning rate seems to be between 3e-5 and 5e-5, but I am still dialling in my sense of this; my first attempts used 0.1, 0.01, and so on, which were far too high. I really messed this up. I should have read the 2015 paper, “Cyclical Learning Rates for Training Neural Networks,” which suggests I was using a learning rate that was way too high.

 

The best solutions seem to use scheduled learning rates, starting with a higher rate and then dropping it to capture finer nuance, i.e. run 10 or so epochs at 5e-5 and then drop to 3e-5.
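A hedged sketch of that schedule (the epoch count and the two rates are just the ones mentioned above, not tuned values):

```python
def scheduled_learning_rate(epoch, high=5e-5, low=3e-5, drop_after=10):
    """Run the first `drop_after` epochs at the higher rate,
    then drop to the lower rate to capture finer nuance."""
    return high if epoch < drop_after else low

# One rate per epoch over a short toy run
rates = [scheduled_learning_rate(e) for e in range(15)]
```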

 

You can see why this process is a bit hard. I can see from my data that a mistake can increase your loss, and I have been dialling this part in, looking at which parts of the maths cause the error to rise.

 

Why Is There No Great Heuristic For “Perfect”?

 

When I started this, I tried what I thought was the standard loss measure: cross-entropy.

 

Though the code I got from vibe coding was this: loss -= std::log(d_out(i, target[i]) + 1e-10f);

 

That is, it takes the value at the correct target (think of the gap between what was predicted and what was right) and accumulates its log.

 

The code supplied when vibe coding is incorrect. The reason is that it used the matrix d_out, which in my code represents the error at the final endpoint of the forward pass.

 

The thing for me was that such a method would reward an AI as long as it was increasing in certainty at this output site, but would not tell you anything about whether there were other outputs in the softmax that were larger, i.e. an AI assessed in this manner could score higher yet still be wrong.
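Purely to illustrate the point, a toy Python sketch (not my C++ code): the per-target log loss can fall between two training steps while another output is still larger, so the greedy prediction stays wrong.

```python
import math

def softmax(logits):
    # Standard numerically-stable softmax
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

target = 1

# Before and after a hypothetical training step:
# the target's probability rises...
before = softmax([2.0, 0.5, 0.1])
after = softmax([2.0, 1.5, 0.1])

loss_before = -math.log(before[target] + 1e-10)
loss_after = -math.log(after[target] + 1e-10)

assert loss_after < loss_before  # the loss says "better"
# ...but another output is still larger, so the prediction is still wrong
assert max(range(3), key=lambda i: after[i]) != target
```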

 

I tried just summing: loss += (d_out(i, j) - indicator) * scale;
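A minimal Python sketch of this summed-distance idea (assuming d_out rows are the model's outputs and indicator is 1.0 at the correct token and 0.0 elsewhere; the names are illustrative, not my C++ code). Note the gaps are signed, so they can cancel.

```python
def distance_loss(d_out, targets, scale=1.0):
    """Sum the signed gap between each output and its one-hot target."""
    loss = 0.0
    for i, row in enumerate(d_out):
        for j, value in enumerate(row):
            indicator = 1.0 if j == targets[i] else 0.0
            loss += (value - indicator) * scale
    return loss

# A perfect one-hot prediction gives zero loss...
assert distance_loss([[0.0, 1.0, 0.0]], [1]) == 0.0
# ...but so does a wrong, split prediction, because signed gaps cancel
assert distance_loss([[0.5, 0.5, 0.0]], [1]) == 0.0
```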

 

This is what I got: it causes the error to go rapidly down and then back up.

The new loss calculation created a bit of a mess, and it lost me a week while I investigated this dynamic. What I concluded was happening was that, at the start, the model reduces the loss on unnecessary outputs (it has 100 outputs but uses only some); then it starts nearly always predicting <EOS> (meaning end of sentence, i.e. the finish point), because every sentence ends with <EOS>, making it the most common token, so the AI defaults to it. At this bottom point, though, it might actually start predicting “the”, i.e. it no longer just predicts the commonest token but starts guessing a “right next token”.

 

It starts routinely outputting something like this. Usually it obsesses over outputting “the” and “cats and”, as those are commonly right.

 

INPUTS: cats and

<SOS> cats and the at the <EOS>

 

But notice that none of the sentences starts with “birds”. If I set the inputs to “the birds”, it seems to prefer to continue the sentence with “nest” or “planes”. I think this shows some sense of semantic formulation, i.e. the words it uses for the next token begin to look like a better fit.

 

I think it would be interesting to investigate when certain token combinations come together.

 

Though this also means that this week of testing and learning was a bit of a washout: if you use cross-entropy, the issue is that you can think the AI is getting better while not looking at whether other heads are even higher, which would make the AI wrong. But if you start measuring those other heads, they introduce the wrong sort of information.

 

Conclusion 

 

I think this is going to be slow going; I really do not know much yet. I am currently working through testing on the layer norm. I have improved the speed of the code already and am trying a few different formulations to settle my thinking on how it really works.

 

Though the series of mistakes with what I have built so far really settled my thoughts that there is something wrong with how we describe AI.

 

What I Learned, And How It Informed My Philosophy Of AI

 

The thing is, with this week of experiments, I was trying to over-train the AI to perfectly match “cats and” to “cats and dogs are pets”.

 

The thing is, it never did this; instead, I found the inner workings are much more stochastic. In some sense, the way the transformer learns is not a straight path; you can intuit that learning is taking place, and you can show growth in learning if you use some heuristics like cross-entropy (though no single one will be perfect). There seems to be some debate over whether they are intelligent; the issue is that the standard design has no real memory or recurrence of thought*. 

So, what does that mean: intelligence without thought?

 

I think I intuit that, internally, the position and token embeddings create a fuzzy, multidimensional numeric representation of a query (i.e., what you write). This fuzzy combination represents the whole sentence but possibly conceals or noises the individual words.

Combining input embeddings with positional embeddings acts as controlled noise on the sentence, so that each word carries both its identity and its position. That sentence as an input is passed to attention heads that split the input into a series of heads, each distinctly learning different “annotations” to that query and passing them along; the feed-forward layer can then act to cut down that annotation where it is excessive, and the layer norm can stop feedback from being explosive. The issue is that at no point does that description require understanding, and I point to the difficulty of perfectly mapping a query to an output as evidence of that.
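The data path described above can be sketched as a single toy block (random weights, NumPy, purely illustrative of the flow, not my actual implementation; all names here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, seq_len, vocab_size = 8, 4, 10

# Illustrative parameters (random here; a real model learns them)
token_emb = rng.normal(size=(vocab_size, d_model))
pos_emb = rng.normal(size=(seq_len, d_model))
W1 = rng.normal(size=(d_model, d_model)) * 0.1
W2 = rng.normal(size=(d_model, d_model)) * 0.1

def layer_norm(x, eps=1e-5):
    # Stops feedback from being explosive by rescaling each position
    mu = x.mean(-1, keepdims=True)
    sd = x.std(-1, keepdims=True)
    return (x - mu) / (sd + eps)

def attention(x):
    # One head, projection matrices folded away for brevity:
    # each position "annotates" the query with a mix of the others
    scores = x @ x.T / np.sqrt(d_model)
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)
    return w @ x

def block(x):
    x = layer_norm(x + attention(x))
    ff = np.maximum(0.0, x @ W1) @ W2  # feed-forward can trim the annotation
    return layer_norm(x + ff)

tokens = np.array([1, 2, 3, 4])
x = token_emb[tokens] + pos_emb  # the fuzzy combined representation
out = block(x)
```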

The AI builds up and tears down what is possibly a set of Cliff's Notes: pointers to the output that make some outputs more likely than others (I think).

I also note that teacher training shapes how the AI learns: it learns the next 20 tokens (i.e. the next 20 words) at once. The usual understanding is that it predicts the next token, and yes, that is true at inference: a much smaller part of the original AI is used to predict it. The issue is that, during training, the AI learns with many heads that collectively encode the entire vector space of future tokens, and it takes many iterations to embed that learning into the weights.
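What that means in practice can be sketched with teacher forcing: the input sentence shifted by one step supplies a target for every position at once (the token ids below are made up for illustration):

```python
def teacher_forcing_pairs(token_ids):
    """During training the model sees the whole sentence at once:
    the input is every token but the last, and the target is the
    sentence shifted one step left, so every position learns its
    own next token simultaneously."""
    return token_ids[:-1], token_ids[1:]

# <SOS> cats and dogs are pets <EOS>, as illustrative ids
sentence = [0, 5, 6, 7, 8, 9, 1]
inputs, targets = teacher_forcing_pairs(sentence)
```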

Training on a whole sentence means you have one input and one output, but too much learning, as seen above, would not let you settle into the correct representation of input to output. Therefore, there is very little doubt in my mind that the transformer could never learn the precise answer to a very specific question, but somehow, by layered fuzzy matching, it appears to get mostly there.

At no point does the AI experience input and output as a continuum. The feeling of AI writing flowing text is possibly some oddity because, actually, there is no continuum: even when teacher training is used to predict the whole sentence, some part of that sentence will be wrong and fuzzy, so you do have to crank out the next token to sort of correct towards the right output.

 

My feeling is that the output is a fuzzy representation of data, such that the AI never actually learns the exact answer that “cats and” finishes with “cats and dogs are pets”. It is never certain, therefore, in an epistemological sense, that it can know that A means B. I feel that what is happening is, in some sense, a compression of information into the weights and out at inference, with no epistemological certainty that A must mean B.

There is an oddness here: layered fuzzy representations do not seem to be able to be brute-forced into exactly hard-coding stimulus to response, and yet there is the phenomenon of one-shot learning, where, given enough statistical representations across many domains and theatres, it does get to that learning.

In a strange way, building one has put me in the hard “no, this is not proper intelligence” camp, and yet you see very starkly exactly why people think it is, because there is this simulacrum of something.

Two comparisons could be made. The first is the common retort that the AI is a stochastic parrot. The issue is that, while the temperature function in the AI is often described as stochastic, it doesn't appear to behave that way. No modulus operations take place, it is decidedly unlike any pseudo-random number generators I have built, and it tracks no seed numbers that would let it be described as stochastic. On reading the code, it is clearly capable of being deterministic, except that floats within C++ lose accuracy on calculations below about 1e-7, such that the randomness seems to arise from small differences in hardware and the drop-off in precision.

 

If given infinite precision at the hardware level, it is hard to see how the temperature function would be random at all. Therefore, calling it stochastic in the technical sense, i.e. producing a random list of numbers using a mathematical function acting on a set of hidden states sequentially, seems wrong: the transformer architecture has hidden states (its weights) and acts sequentially, but it acts from the input, not from those hidden states.
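A sketch of temperature as I understand it (a deterministic reshaping of the output distribution; any randomness comes from the separate sampling step, which can itself be seeded — names here are illustrative):

```python
import math
import random

def apply_temperature(logits, temperature):
    """Dividing logits by the temperature sharpens (<1) or flattens (>1)
    the softmax; the function itself is fully deterministic."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    s = sum(exps)
    return [e / s for e in exps]

logits = [2.0, 1.0, 0.5]
probs = apply_temperature(logits, 0.7)

# Same input, same output: nothing stochastic in temperature itself
assert apply_temperature(logits, 0.7) == probs

# Sampling afterwards is where randomness enters, and it is seedable
rng = random.Random(42)
choice = rng.choices(range(3), weights=probs, k=1)[0]
```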

 

We often discuss Gen AI as a replacement for search, but if we accept here that the AI is trained by repeated exposure to different data sets and combines that into a fuzzy matching between the input query and the output, then, in a sense, you have never searched for any specific information. It is often described as the AI having learned the average representation of the prompt and answer. Still, while that might be good enough for us to get an answer, it is not quite semantically correct, since nothing specific was searched for and matched against; rather, a fuzzy representation of a training set was recreated.

 

If it is this fuzzy match, then it explains why the gap between a given input and output cannot be narrowed to zero error and absolute certainty: the token and position embeddings make a fuzzy representation of the inputs, and teacher training creates a fuzzy vector embedding of the future. I search for words to describe what I think happens here, but I think its description is outside modern parlance, and my concern is that we have missed that the AI designs we are using are stranger than a simple expression would describe.

 

Then people typically talk about AI agents or agentic AI. This seems an odd concept to apply to something we accept has difficulty even being trained to predict “cats and” into “cats and dogs are pets”. An agent implies representation, with certain moral rights to act on behalf of another. It may seem odd to say it acted stochastically, but there is potential, since the gap between pseudo-random number generators and real ones is not overwhelming. Likewise, we have no approach to business operations that would allow us to estimate the error rate for these agents losing the plot and hallucinating, or otherwise, in the way we are used to doing for a given human.

 

Further, if we take the logic by which we arrived at the conclusion that transformer AI does not, in fact, search data the way search does, it is hard to see the intent or behaviours necessary for us to infer that these systems act according to a set of conduct or responsibilities.

 

It feels to me that agent and agentic require some sense of internal representation that changes as the transaction takes place, such that linguistically we would say actions are carried out, and that implies some movement, and many words we use about intelligence imply internal change and movement, which is wholly absent in the basic transformer and only moderately with some memory systems added. Therefore, I feel uneasy about calling something an agent and implying it acted in the world, when I know that wording implies a sense of action that is greater than the internal context allows.

 

All in all, this feels like a language trap that we are invited to either cognitively grant AI full intelligence or not grant it any intelligence at all. Our language conveys statefulness with agency and moral authority with intelligence. The issue might be that the two are divorced here.

That is why the quote at the start feels important. Can you have intelligence without emotion? A transformer feels like it challenges that idea, because where is the emotion inside it?

AI is not quite any of these things, neither search nor agent; therefore some would dismiss it as not really anything, fulfilling no given category. Yet AI keeps astounding me with the quality of its actions from one-shot learning. If we assume that all these emergent behaviours are, in some sense, applications of one-shot learning, then the AI applies statistical models learned from its training data to infer a scaffolding, so that it acts as an agent or responds as if it were searching a database, even though it is, in some sense, rebuilding a fuzzy match from its weights and not doing either of those things.

 

There is then an interesting thought: clearly, our language and expectations do not map well to describing this AI architecture, and perhaps an important thing to do would be to invent a better vocabulary, because while this is not search and it is not an agent, it is definitely something.

The challenge will be where that something sits and what areas it will be useful in. I think currently there is only one scaled business model: big AI through an API. I just sort of feel we might be doing this wrong.



*(Neuro-symbolic and reasoner models notwithstanding, treated here as separate objects; though even then, I think such a model might treat that as internal data annotation, not memory or recurrence of thought as we think about those concepts in ourselves.)

 

Appendix A

 

# This is my grab bag of tests, with a visualisation function if you need to graph them quickly

 

def line_graph(data_set):
    import seaborn as sns
    import matplotlib.pyplot as plt
    import pandas as pd

    # Truncate every series to the shortest one so they share an x axis
    mini = min(len(data) for data in data_set)
    data_set = [data[:mini] for data in data_set]

    # Build a wide table, then melt it to long form for seaborn
    table = {'x': list(range(mini))}
    for count, data in enumerate(data_set, start=1):
        table["Series " + str(count)] = data
    df = pd.DataFrame(table)
    df_melted = df.melt(id_vars='x', var_name="Series", value_name='Value')

    sns.set_style("whitegrid")
    plt.figure(figsize=(10, 5))
    sns.lineplot(data=df_melted, x='x', y='Value', hue='Series',
                 marker='o', markersize=8, linewidth=2.5)
    plt.xlabel("Index", fontsize=14)
    plt.ylabel("Values", fontsize=14)
    plt.show()

 

#const float learning_rate = 0.1f;

#const float clip_value = 1.0f;

#INPUTS:cats and

#<SOS> cats and night <EOS>

#INPUTS:the birds

#<SOS> the birds <EOS>

value1=[26.226774,25.701759,25.714996,25.742743,25.759310,25.758923,25.733387,25.728062,25.731478,25.704405,25.763687,25.688051,25.713028,25.737524,25.695087,25.711500,25.704836,25.692451,25.738262,25.674992,25.694241,25.663145,25.694832,25.707148,25.670290,25.718435,25.713039,25.672068,25.728395,25.686209,25.689964,25.715584,25.698694,25.686705,25.658905,25.660103,25.682713,25.718739,25.688805,25.714941,25.694094,25.699272,25.677120,25.716221,25.700186,25.694572,25.704815,25.706230,25.704372,25.652359,25.728872,25.667116,25.716627,25.659765,25.740231,25.647087,25.725023,25.658463,25.741636,25.671240,25.680092,25.658659,25.726645,25.693377,25.714062,25.653708,25.671448,25.700102,25.684483,25.665049,25.677147,25.696060,25.707174,25.679018,25.675726,25.710423,25.716873,25.620930,25.692577,25.703234,25.672249,25.733526,25.613436,25.704346,25.684059,25.722448,25.663067,25.680273,25.722212,25.693602,25.653986,25.646317,25.731791,25.689953,25.703741,25.656977,25.699322,25.627682,25.665091,25.685101]

 

#const float learning_rate = 0.1f;

#const float clip_value = 0.5f;

#INPUTS:cats and

#<SOS> cats and flew over the on shines the the flew sat

#INPUTS:the birds

#<SOS> the birds are on the <SOS> <EOS>

#INPUTS:cats and

#<SOS> cats and a plane mat <EOS>

#INPUTS:the birds

#<SOS> the birds planes <EOS>

value2=[26.181168,25.690424,25.729572,25.673862,25.835621,25.770815,25.695261,25.748415,25.701429,25.747629,25.681509,25.770033,25.639421,25.710543,25.675377,25.711670,25.686592,25.761076,25.714931,25.658760,25.728176,25.683060,25.706488,25.744888,25.679209,25.653870,25.666479,25.745337,25.721312,25.728321,25.700623,25.692337,25.639988,25.716379,25.723486,25.706869,25.704271,25.703001,25.699255,25.664965,25.629135,25.645628,25.733843,25.635344,25.720863,25.707874,25.720490,25.709137,25.702002,25.688171,25.681549,25.684797,25.719183,25.684486,25.705130,25.642893,25.687969,25.670296,25.711288,25.645281,25.712120,25.705004,25.671196,25.689182,25.647505,25.705200,25.680365,25.684982,25.689552,25.718246,25.638453,25.730288,25.687006,25.652149,25.682180,25.723263,25.703318,25.653503,25.700041,25.652063,25.675575,25.693222,25.710365,25.627613,25.687899,25.710182,25.702644,25.653826,25.667048,25.689928,25.678200,25.686251,25.638573,25.661963,25.724972,25.702410,25.673536,25.629120,25.711605,25.668434]

 

#const float learning_rate = 0.01f;

#const float clip_value = 1.0f;

#<SOS> cats and the sun sky glows <EOS>

#<SOS> cats and over in the <PAD> sky city <EOS>

#<SOS> cats and birds city <EOS>

#<SOS> cats and cat the the log <EOS>

#<SOS> cats and sky the dog the <EOS>

#<SOS> cats and dogs the <EOS>

#<SOS> cats and city the city <EOS>

#INPUTS:the birds

#<SOS> the birds flew the the nest <EOS>

#<SOS> cats and the moon the at <EOS>

#INPUTS:the birds

#<SOS> the birds sky the the the nest on log on bird

#INPUTS:the birds

#<SOS> the birds and fly <EOS>

#<SOS> cats and over the <EOS>

#<SOS> the birds night city sky the the the the over the

#<SOS> the birds on nest <EOS>

#<SOS> cats and dogs <EOS>

#INPUTS:the birds

#<SOS> the birds over the nest city <EOS>

#<SOS> cats and the flew the log sky fly <EOS>

#<SOS> cats and over the dogs the <EOS>

 

value3=[27.515440,26.628044,25.734200,25.539015,25.647160,25.680950,25.657923,25.664593,25.668602,25.738916,25.667194,25.638659,25.640858,25.634600,25.780935,25.660576,25.755518,25.616585,25.645792,25.830366,25.723436,25.609268,25.740021,25.652304,25.752956,25.733223,25.672405,25.799860,25.707870,25.793587,25.724186,25.806486,25.712740,25.732367,25.738176,25.736868,25.649178,25.755104,25.768633,25.693707,25.700121,25.664139,25.661688,25.815422,25.722462,25.733938,25.675703,25.738979,25.709452,25.656603,25.699940,25.740471,25.721142,25.655142,25.662174,25.690855,25.733179,25.670712,25.684280,25.698597,25.716646,25.686440,25.717731,25.679331,25.698288,25.714241,25.729916,25.718979,25.694777,25.662954,25.648039,25.700121,25.715860,25.691290,25.671495,25.749743,25.695263,25.701941,25.733755,25.663649,25.682989,25.683771,25.718260,25.645815,25.710592,25.737827,25.668808,25.679247,25.666748,25.704668,25.659662,25.706909,25.661619,25.689299,25.737139,25.680632,25.631901,25.683937,25.705719,25.643236]



#const float learning_rate = 0.01f;

#const float clip_value = 0.5f;

 

value4=[27.405680,26.162703,25.418596,25.663071,25.699001,25.594946,25.670521,25.667967,25.633636,25.667458,25.679482,25.687418,25.675112,25.613466,25.719805,25.749428,25.667475,25.693193,25.725777,25.752304,25.774834,25.811039,25.771242,25.635426,25.637852,25.615829,25.747805,25.640455,25.754374,25.667435,25.686047,25.836296,25.708590,25.723330,25.651308,25.729488,25.738344,25.779291,25.764896,25.741949,25.683998,25.737751,25.699373,25.695721,25.646036,25.712177,25.646519,25.705647,25.665615,25.589195,25.628305,25.666752,25.668684,25.735542,25.700396,25.714306,25.719734,25.739098,25.718998,25.691771,25.633966,25.695120,25.669527,25.719898,25.699270,25.706776,25.710192,25.720734,25.686539,25.704451,25.692194,25.689613,25.708881,25.714888,25.676142,25.668346,25.703115,25.697540,25.720797,25.662491,25.646084,25.752934,25.647690,25.654613,25.739658,25.670511,25.714012,25.651772,25.683043,25.723600,25.653679,25.685343,25.646137,25.724932,25.661434,25.699223,25.684637,25.700840,25.650129,25.664631]



#const float learning_rate = 0.001f;

#const float clip_value = 1.0f;

#<SOS> cats and <UNK> pets <EOS>

#<SOS> the birds planes flew on in the planes nest <EOS>

#<SOS> cats and dogs on are <EOS>

 

value5=[27.708464,27.576506,27.330059,26.940651,26.463181,26.011755,25.731647,25.584883,25.457909,25.424345,25.445610,25.473658,25.558762,25.649807,25.698174,25.749683,25.787193,25.769241,25.775158,25.734152,25.739435,25.707245,25.723719,25.674477,25.693583,25.677774,25.642973,25.651213,25.620050,25.630676,25.616451,25.638842,25.643913,25.635878,25.629280,25.635826,25.644199,25.615149,25.636217,25.678980,25.622744,25.647797,25.686234,25.644398,25.666290,25.610315,25.661289,25.687784,25.644361,25.713903,25.638882,25.671160,25.675053,25.674179,25.666794,25.654612,25.639036,25.696268,25.617657,25.700581,25.670330,25.660374,25.662519,25.682364,25.648180,25.704678,25.681705,25.681181,25.656963,25.654154,25.679382,25.726156,25.619831,25.703568,25.691046,25.629770,25.623592,25.731501,25.696110,25.643248,25.682529,25.705917,25.649511,25.724731,25.695526,25.682467,25.658518,25.664610,25.699081,25.626112,25.667290,25.715675,25.678991,25.696842,25.588512,25.718939,25.697763,25.749317,25.666672,25.674713]



#const float learning_rate = 0.001f;

#const float clip_value = 0.5f;

#<SOS> cats and mat on moon at the <EOS>

#<SOS> cats and on night the shines city planes <EOS>



value6=[27.708782,27.630318,27.498140,27.284555,26.965952,26.570614,26.202991,25.933418,25.783600,25.765326,25.717745,25.632391,25.587326,25.583286,25.617723,25.623838,25.638706,25.654982,25.639378,25.668079,25.700226,25.675215,25.662548,25.705908,25.668640,25.676424,25.671131,25.687456,25.684557,25.690907,25.673069,25.670683,25.673819,25.625751,25.701426,25.658289,25.637264,25.621685,25.691271,25.604307,25.660316,25.635794,25.686478,25.643383,25.691278,25.599079,25.692387,25.639124,25.647766,25.690201,25.635729,25.629627,25.704012,25.619204,25.689137,25.593306,25.702782,25.656689,25.676079,25.638269,25.680857,25.639570,25.723698,25.684286,25.672779,25.701878,25.648329,25.724628,25.647314,25.667202,25.715466,25.657230,25.714502,25.600248,25.714560,25.660620,25.657026,25.625420,25.707357,25.655823,25.649923,25.630295,25.687574,25.685053,25.690886,25.643049,25.742393,25.637644,25.735563,25.684940,25.686495,25.659883,25.676949,25.699429,25.720873,25.667372,25.728666,25.666754,25.721998,25.673717]



#const float learning_rate = 0.0001f;

#const float clip_value = 1.0f;



value7=[27.766819,27.760958,27.752966,27.742641,27.729431,27.713276,27.693167,27.668459,27.638058,27.601259,27.556648,27.501892,27.435314,27.355574,27.261074,27.145397,27.012743,26.856989,26.682219,26.487377,26.274273,26.051233,25.826984,25.602077,25.394279,25.196283,25.018188,24.868286,24.737692,24.635431,24.552223,24.493700,24.449854,24.430548,24.421106,24.428102,24.445541,24.475281,24.517727,24.570244,24.635687,24.704912,24.790323,24.880068,24.985641,25.088531,25.204664,25.323294,25.438807,25.553986,25.667109,25.777573,25.883564,25.975878,26.064793,26.139479,26.206539,26.261599,26.310452,26.346771,26.376860,26.398972,26.412275,26.419197,26.417416,26.409477,26.394962,26.372601,26.347792,26.316919,26.278917,26.239544,26.196550,26.152634,26.104618,26.053396,26.003880,25.952515,25.905558,25.859356,25.816679,25.773018,25.729073,25.697578,25.664610,25.639303,25.618969,25.598824,25.582766,25.566757,25.559771,25.554249,25.547752,25.551313,25.555525,25.555220,25.570843,25.571615,25.580206,25.591257]



#<SOS> cats and the the the fly the the flew the the

#const float learning_rate = 0.0001f;

#const float clip_value = 0.5f;

 

value8=[27.769815,27.764240,27.756523,27.746569,27.733582,27.717825,27.698030,27.673559,27.643919,27.607346,27.562559,27.508526,27.440769,27.359871,27.261538,27.143778,27.003328,26.841305,26.654877,26.451254,26.232700,26.008461,25.783030,25.567377,25.370464,25.198395,25.051783,24.939013,24.849373,24.794178,24.768173,24.751310,24.773571,24.794157,24.826452,24.885805,24.931606,24.976830,25.034931,25.094931,25.145485,25.201294,25.244974,25.303888,25.358341,25.418388,25.476374,25.539568,25.602646,25.665361,25.725580,25.795111,25.848356,25.915665,25.967539,26.020649,26.059719,26.105295,26.145058,26.173489,26.196804,26.213900,26.229027,26.239614,26.243959,26.246702,26.240959,26.229605,26.219368,26.203556,26.186049,26.162601,26.138527,26.111612,26.084023,26.054831,26.028585,26.000103,25.967192,25.941290,25.908800,25.883202,25.856483,25.824997,25.805498,25.787142,25.767101,25.743729,25.727011,25.715267,25.692165,25.681070,25.667488,25.649666,25.649767,25.636330,25.628098,25.617790,25.621332,25.609030]

 

#Fixed gradient clipping

#const float learning_rate = 0.0001f;

#const float clip_value = 0.5f;

 

value9=[27.830788,27.827406,27.823013,27.817240,27.810329,27.801233,27.790476,27.777100,27.761238,27.741917,27.718111,27.689476,27.654528,27.611000,27.557966,27.495077,27.416386,27.320770,27.207685,27.072239,26.918713,26.737972,26.538095,26.323174,26.096359,25.865147,25.649794,25.434059,25.243210,25.075291,24.931078,24.813225,24.717325,24.646139,24.598694,24.551926,24.537172,24.527769,24.540039,24.559010,24.582825,24.619902,24.666805,24.718214,24.780117,24.847811,24.926607,25.005905,25.093988,25.180168,25.267340,25.358871,25.443504,25.526989,25.613558,25.688477,25.755445,25.825266,25.876507,25.931126,25.974276,26.012135,26.037216,26.062717,26.080585,26.092644,26.101704,26.098864,26.102385,26.094423,26.082645,26.069334,26.050470,26.028591,26.010244,25.988218,25.962126,25.940138,25.912319,25.889181,25.863464,25.836180,25.811113,25.786905,25.769413,25.750570,25.728935,25.712339,25.698776,25.676373,25.667236,25.654200,25.643000,25.635946,25.629021,25.611816,25.617819,25.604780,25.604174,25.601402]





value10=[27.796680,27.796234,27.795658,27.794912,27.794010,27.792929,27.791769,27.790354,27.788830,27.787088,27.785280,27.783184,27.780941,27.778429,27.775843,27.772972,27.769966,27.766693,27.763157,27.759445,27.755522,27.751272,27.746744,27.742023,27.737091,27.731497,27.725739,27.719927,27.713343,27.706678,27.699425,27.691931,27.683666,27.675156,27.666525,27.656597,27.646610,27.635801,27.624540,27.612741,27.599878,27.586599,27.572414,27.557402,27.541443,27.524544,27.506506,27.487898,27.468040,27.446751,27.424530,27.400486,27.376293,27.349291,27.321283,27.292006,27.260399,27.228064,27.192869,27.156605,27.117388,27.076273,27.033464,26.989414,26.941465,26.891727,26.839718,26.786261,26.729006,26.669754,26.608452,26.544323,26.477942,26.409800,26.338909,26.266228,26.192022,26.115295,26.036882,25.957090,25.875746,25.795267,25.712185,25.629528,25.545828,25.462549,25.380270,25.296610,25.215046,25.134415,25.054564,24.977072,24.900740,24.826717,24.754623,24.685139,24.617203,24.552263,24.490725,24.431009]

 

value11=[27.807413,27.807381,27.807344,27.807295,27.807241,27.807171,27.807085,27.807003,27.806900,27.806780,27.806658,27.806528,27.806385,27.806211,27.806046,27.805878,27.805691,27.805481,27.805269,27.805046,27.804810,27.804564,27.804306,27.804029,27.803759,27.803453,27.803146,27.802824,27.802500,27.802174,27.801815,27.801447,27.801071,27.800694,27.800266,27.799860,27.799431,27.798990,27.798546,27.798073,27.797606,27.797112,27.796589,27.796074,27.795559,27.795004,27.794441,27.793871,27.793259,27.792652,27.792034,27.791426,27.790785,27.790104,27.789402,27.788717,27.788012,27.787296,27.786549,27.785770,27.784996,27.784225,27.783409,27.782595,27.781752,27.780880,27.780018,27.779114,27.778210,27.777267,27.776276,27.775320,27.774317,27.773296,27.772282,27.771235,27.770153,27.769056,27.767954,27.766798,27.765680,27.764435,27.763260,27.762020,27.760767,27.759495,27.758188,27.756863,27.755535,27.754162,27.752708,27.751345,27.749893,27.748360,27.746912,27.745340,27.743771,27.742176,27.740520,27.738901,27.737185,27.735491,27.733719,27.731939,27.730093,27.728287,27.726381,27.724501,27.722460,27.720524,27.718468,27.716457,27.714361,27.712259,27.710079,27.707764,27.705576,27.703230,27.700897,27.698465,27.696070,27.693533,27.691053,27.688562,27.685932,27.683203,27.680502,27.677837,27.674969,27.672192,27.669147,27.666201,27.663198,27.660130,27.656961,27.653820,27.650549,27.647261,27.643866,27.640419,27.636976,27.633339,27.629757,27.625996,27.622286,27.618448,27.614672,27.610624,27.606562,27.602394,27.598173,27.593973,27.589605,27.585196,27.580622,27.576120,27.571453,27.566633,27.561695,27.556900,27.551754,27.546761,27.541445,27.536190,27.530767,27.525307,27.519753,27.514101,27.508011,27.502232,27.496229,27.490139,27.483891,27.477789,27.471169,27.464518,27.457724,27.451057,27.444170,27.436989,27.430014,27.422676,27.415159,27.407614,27.399948,27.391962,27.383989,27.375971,27.367832,27.359407,27.350973,27.342142,27.333361,27.324436,27.315258,27.305998,27.296535,27.286825,27.277330,2
7.267376,27.257221,27.247169,27.236589,27.225933,27.215412,27.204317,27.193386,27.182310,27.170774,27.159081,27.147268,27.135372,27.123228,27.110760,27.098364,27.085373,27.072834,27.059496,27.046717,27.032906,27.019306,27.005400,26.991465,26.977522,26.963057,26.948179,26.933416,26.918598,26.903008,26.887856,26.872252,26.856462,26.840361,26.824505,26.807833,26.791361,26.774393,26.757547,26.740704,26.722847,26.705450,26.687838,26.669920,26.651943,26.633593,26.614986,26.596189,26.577721,26.558327,26.539017,26.519976,26.500368,26.480755,26.461264,26.441229,26.420710,26.400745,26.379602,26.359650,26.338114,26.317366,26.296669,26.275360,26.254171,26.232435,26.211481,26.188927,26.167250,26.145712,26.124134,26.101517,26.079950,26.057018,26.034678,26.012413,25.989555,25.967844,25.944569,25.922493,25.899529,25.877054,25.853941,25.830936,25.808504,25.785358,25.762503,25.739643,25.717026,25.693922,25.671152,25.648125,25.625111,25.602079,25.580200,25.557293,25.534777,25.511272,25.489117,25.466429,25.444220,25.421915,25.399424,25.377077,25.354921,25.332930,25.311275,25.289164,25.267662,25.245476,25.224569,25.203247,25.181368,25.160694,25.140039,25.118731,25.098436,25.077574,25.057051,25.037043,25.017191,24.997154,24.977098,24.957537,24.938328,24.918921,24.900171,24.881104,24.862827,24.844009,24.825529,24.807503,24.790012,24.771927,24.754471,24.736778,24.719643,24.703115,24.686363,24.669434,24.653542,24.637054,24.621622,24.605810,24.590012,24.574749,24.559771,24.544979,24.530333,24.515903,24.501543,24.487829,24.473946,24.460354,24.446833,24.433573,24.420698,24.407932,24.395193,24.383360,24.371109,24.359228,24.347630,24.336346,24.324785,24.314171,24.303152,24.292572,24.281933,24.271881,24.261858,24.251871,24.242533,24.233089,24.223955,24.214920,24.206276,24.197569,24.189180,24.180962,24.173227,24.165308,24.157557,24.150301,24.142962,24.135874,24.129108,24.122379,24.116030,24.109291,24.103167,24.097469,24.091787,24.086058,24.080898,24.075460,24.070700,24.065948,24.061029,24.056557,2
4.052437,24.047884,24.043982,24.040087,24.036524,24.033285,24.029879,24.026550,24.023439,24.020622,24.017700,24.015652,24.013254,24.010891,24.008705,24.006971,24.005455,24.003822,24.002260,24.001093,23.999731,23.999073,23.997963,23.997356,23.996717,23.996338,23.996462,23.996126,23.996107,23.996389,23.996706,23.996929,23.997820,23.998257,23.999500,24.000530,24.001652,24.002773,24.004482,24.006027,24.007851,24.009668,24.011698,24.013704,24.016018,24.018387,24.020813,24.023590,24.026400,24.029432,24.032610,24.035721,24.038946,24.042374,24.046156,24.049768,24.053793,24.057768,24.062185,24.066338,24.070631,24.075188,24.079779,24.084663,24.089386,24.094700,24.099758,24.105190,24.110672,24.116219,24.122177,24.128056,24.133982,24.140102,24.146383,24.152788,24.159321,24.166103,24.172913,24.179867,24.187080,24.194307,24.201742,24.209192,24.216820,24.224741,24.232437]



value12=[55.829765,55.829605,55.829430,55.829182,55.828896,55.828575,55.828186,55.827747,55.827271,55.826740,55.826130,55.825500,55.824821,55.824055,55.823269,55.822411,55.821468,55.820511,55.819469,55.818390,55.817245,55.816055,55.814762,55.813435,55.812027,55.810574,55.809032,55.807411,55.805737,55.803974,55.802151,55.800266,55.798298,55.796173,55.794037,55.791824,55.789478,55.787018,55.784538,55.781857,55.779148,55.776295,55.773388,55.770359,55.767101,55.763866,55.760422,55.756886,55.753242,55.749447,55.745399,55.741352,55.737034,55.732689,55.727936,55.723267,55.718273,55.713108,55.707767,55.702251,55.696564,55.690529,55.684269,55.677818,55.671188,55.664253,55.657009,55.649578,55.641685,55.633751,55.625191,55.616550,55.607517,55.598118,55.588253,55.578041,55.567490,55.556404,55.544876,55.533039,55.520630,55.507675,55.494122,55.480305,55.465736,55.450710,55.435078,55.418594,55.401676,55.383659,55.365425,55.346329,55.326336,55.305676,55.283802,55.261375,55.238194,55.214130,55.188549,55.162384,55.135239,55.106869,55.077408,55.047146,55.015495,54.982559,54.948261,54.913036,54.876183,54.838432,54.799267,54.758286,54.716312,54.672333,54.627327,54.581089,54.533501,54.483437,54.433155,54.380646,54.326778,54.271896,54.214211,54.155918,54.096573,54.035488,53.973606,53.909893,53.844078,53.777962,53.710682,53.642662,53.572453,53.502621,53.430950,53.359051,53.286488,53.212887,53.139507,53.065418,52.991219,52.916851,52.842342,52.767750,52.693863,52.619812,52.545540,52.473755,52.400433,52.328732,52.257912,52.188847,52.120190,52.051716,51.985344,51.919353,51.855515,51.792240,51.730831,51.670944,51.611446,51.554634,51.499115,51.445023,51.392200,51.341099,51.292076,51.244080,51.197773,51.152580,51.109535,51.067528,51.027252,50.987957,50.950615,50.914951,50.879700,50.846283,50.814209,50.783470,50.753796,50.725594,50.698280,50.672054,50.646973,50.622959,50.599789,50.577965,50.556873,50.536583,50.517300,50.498657,50.481079,50.464355,50.448097,50.432640,50.417980,50.403717,50.390411,5
0.377567,50.365326,50.353580,50.342464,50.331844,50.321571,50.311951,50.302601,50.293896,50.285412,50.277466,50.269844,50.262569,50.255653,50.249046,50.242859,50.236904,50.231216,50.225807,50.220695,50.215832,50.211227,50.206871,50.202694,50.198811,50.195110,50.191628,50.188374,50.185261,50.182297,50.179516,50.176891,50.174675,50.172394,50.170254,50.168358,50.166550,50.164700,50.163395,50.162029,50.160614,50.159634,50.158764,50.157829,50.157013,50.156265,50.155869,50.155392,50.155067,50.155159,50.155106,50.155140,50.154915,50.155613,50.156082,50.156284,50.157017,50.157646,50.158340,50.159634,50.160667,50.161335,50.162350,50.163990,50.165874,50.167126,50.169014,50.170761,50.172844,50.174999,50.176865,50.179272,50.181293,50.184410,50.186592,50.189831,50.193367,50.196209,50.199196,50.202496,50.205917,50.209381,50.213570,50.216888,50.221329,50.225887,50.230431,50.234375,50.239643,50.245708,50.249660,50.255192,50.259491,50.266697,50.272655,50.278931,50.284813,50.290794,50.297607,50.304623,50.312504,50.319271,50.326653,50.332710,50.341362,50.352047,50.358768,50.365780,50.374306,50.384239,50.393234,50.403450,50.411644,50.421326,50.432697,50.442238,50.451767,50.463120,50.472149,50.484482,50.493053,50.506374,50.514503,50.528656,50.537308,50.553631,50.564362,50.573040,50.587791,50.599121,50.608429,50.622005,50.631969,50.645004,50.659302,50.669125,50.679707,50.692898,50.706581,50.716087,50.729565,50.743084,50.751659,50.763866,50.775589,50.789051,50.799274,50.811657,50.819504,50.831707,50.840591,50.850544,50.864262,50.870998,50.882156,50.891720,50.899967,50.909245,50.919189,50.926994,50.934795,50.943649,50.953106,50.960060,50.969608,50.974586,50.982613,50.990036,50.995190,51.002953,51.007824,51.013752,51.020004,51.026611,51.031307,51.036125,51.039997,51.045635,51.049465,51.052879,51.057514,51.061211,51.062935,51.066998,51.069439,51.072227,51.074009,51.076775,51.078304,51.079960,51.082989,51.082798,51.084476,51.084988,51.085102,51.086262,51.086575,51.086197,51.086845,51.086529,5
1.085876,51.084915,51.084526,51.083752,51.082424,51.081013,51.079815,51.078117,51.076279,51.074738,51.072403,51.070145,51.067772,51.065105,51.062576,51.059975,51.056988,51.054131,51.050888,51.047585,51.044388,51.040844,51.037457,51.033844,51.030003,51.026199,51.022335,51.018383,51.014267,51.010292,51.006134,51.001892,50.997589,50.993248,50.988903,50.984383,50.979942,50.975552,50.970863,50.966423,50.961861,50.957180,50.952511,50.948055,50.943443,50.938725,50.933956,50.929447,50.924854,50.920380,50.915615,50.911285,50.906639,50.902195,50.897823,50.893410,50.888741,50.884548,50.880527,50.875954,50.871731,50.867302,50.863277,50.858841,50.855442,50.851379,50.847336,50.843529,50.839306,50.835724,50.832024,50.828537,50.825111,50.822132,50.818501,50.815315,50.812370,50.808334,50.804520,50.802311,50.798496,50.796490,50.793018,50.790771,50.787266,50.783978,50.782364,50.778446,50.777737,50.773136,50.771381,50.769405,50.767487,50.766144,50.762951,50.760139,50.759594,50.757160,50.755329,50.751923,50.752155,50.749344,50.748699,50.746399,50.745010,50.743694,50.743462,50.742191,50.739597,50.737293,50.736778,50.736244,50.734070,50.735935,50.731991,50.730789,50.730907,50.729061,50.729324,50.727192,50.728432,50.725784,50.725494,50.727077,50.726887,50.724091,50.722839,50.723869,50.723904,50.720097,50.719837,50.720943,50.718632,50.720940,50.719063,50.720387,50.719902,50.720078,50.720104,50.720741,50.718597,50.717258,50.718040,50.718540,50.716896,50.716633,50.716133,50.716515,50.719574,50.719181,50.716316,50.717339,50.717896,50.717419,50.716633,50.717770,50.717529,50.720242,50.717255,50.717659,50.715511,50.714890,50.716942,50.718327,50.718086,50.716099,50.716515,50.717045,50.715992,50.720318,50.718418,50.719616,50.720200,50.716019,50.717888,50.717438,50.717758,50.717339,50.717541,50.718979,50.720928,50.719513,50.720249,50.720764,50.722202,50.717979]

 

value13=[27.820282,27.820259,27.820238,27.820202,27.820164,27.820122,27.820068,27.820009,27.819942,27.819866,27.819790,27.819698,27.819613,27.819510,27.819401,27.819290,27.819164,27.819042,27.818905,27.818758,27.818615,27.818455,27.818289,27.818123,27.817930,27.817753,27.817562,27.817356,27.817150,27.816929,27.816708,27.816469,27.816227,27.815990,27.815727,27.815460,27.815189,27.814903,27.814619,27.814320,27.814001,27.813688,27.813364,27.813032,27.812693,27.812330,27.811977,27.811598,27.811211,27.810827,27.810436,27.810009,27.809591,27.809158,27.808716,27.808268,27.807798,27.807320,27.806839,27.806332,27.805822,27.805313,27.804775,27.804226,27.803682,27.803108,27.802536,27.801928,27.801306,27.800682,27.800041,27.799400,27.798721,27.798037,27.797363,27.796633,27.795912,27.795158,27.794415,27.793634,27.792847,27.792055,27.791204,27.790367,27.789513,27.788631,27.787718,27.786804,27.785858,27.784906,27.783916,27.782923,27.781887,27.780848,27.779768,27.778694,27.777582,27.776461,27.775282,27.774103,27.772894,27.771669,27.770384,27.769075,27.767752,27.766422,27.765030,27.763634,27.762207,27.760736,27.759211,27.757690,27.756092,27.754513,27.752880,27.751221,27.749514,27.747738,27.745974,27.744123,27.742258,27.740349,27.738415,27.736397,27.734388,27.732309,27.730122,27.727976,27.725729,27.723480,27.721144,27.718775,27.716354,27.713869,27.711308,27.708696,27.706078,27.703342,27.700548,27.697712,27.694773,27.691797,27.688759,27.685659,27.682419,27.679232,27.675873,27.672434,27.668999,27.665382,27.661768,27.657949,27.654200,27.650288,27.646273,27.642191,27.637989,27.633717,27.629238,27.624834,27.620255,27.615564,27.610762,27.605867,27.600752,27.595640,27.590384,27.584913,27.579466,27.573832,27.567986,27.562119,27.556049,27.549889,27.543480,27.537054,27.530468,27.523638,27.516729,27.509628,27.502365,27.495047,27.487370,27.479605,27.471680,27.463465,27.455183,27.446712,27.438082,27.429174,27.420046,27.410786,27.401205,27.391546,27.381516,27.371384,27.361126,27.350468,27.339596,2
7.328568,27.317160,27.305519,27.293797,27.281857,27.269611,27.257004,27.244020,27.231001,27.217646,27.204050,27.190065,27.175829,27.161474,27.146614,27.131580,27.116020,27.100367,27.084440,27.068022,27.051428,27.034670,27.017262,26.999840,26.981737,26.964018,26.945383,26.926550,26.907410,26.887703,26.868170,26.848158,26.827629,26.806812,26.785524,26.763983,26.742485,26.720131,26.697987,26.675140,26.652010,26.628849,26.604950,26.581110,26.556744,26.531996,26.507278,26.481869,26.456263,26.430328,26.404747,26.378271,26.351606,26.324753,26.297331,26.270077,26.242418,26.214352,26.186670,26.158428,26.129484,26.101084,26.072008,26.043095,26.013695,25.984436,25.954504,25.925213,25.895308,25.865667,25.835648,25.805477,25.775019,25.745314,25.714537,25.684296,25.654369,25.623562,25.593464,25.562849,25.532537,25.502459,25.472267,25.442200,25.411911,25.382334,25.352316,25.321890,25.292728,25.262867,25.233372,25.204435,25.175476,25.145815,25.117903,25.089260,25.060738,25.032282,25.004683,24.977127,24.949610,24.922424,24.895741,24.868919,24.842815,24.816694,24.790979,24.765139,24.740107,24.715010,24.690926,24.666300,24.642689,24.618925,24.595362,24.572607,24.550213,24.527662,24.506483,24.484533,24.463669,24.442142,24.422558,24.402157,24.381863,24.363029,24.343529,24.325024,24.306524,24.288414,24.270752,24.253298,24.236408,24.220303,24.203363,24.188269,24.172365,24.157465,24.142841,24.128460,24.114340,24.099953,24.086403,24.073650,24.061502,24.048885,24.035975,24.025127,24.013186,24.002308,23.991488,23.981302,23.971008,23.961374,23.951073,23.942329,23.933437,23.925432,23.916231,23.908249,23.901260,23.893183,23.885941,23.879803,23.872948,23.866085,23.859051,23.853561,23.847404,23.843107,23.837658,23.831795,23.828321,23.823231,23.819565,23.814896,23.810841,23.808527,23.804762,23.802473,23.799095,23.795158,23.792664,23.791201,23.788132,23.786327,23.784838,23.783636,23.781445,23.780785,23.778929,23.778316,23.777983,23.776524,23.775810,23.774673,23.775169,23.775429,23.774782,23.774870,2
3.775084,23.775354,23.776203,23.775467,23.776876,23.777529,23.777956,23.778881,23.778639,23.781546,23.783291,23.784019,23.785282,23.786634,23.787458,23.788723,23.791029,23.792332,23.794689,23.795315,23.797514,23.799553,23.801533,23.803009,23.804409,23.807732,23.810152,23.811728,23.813358,23.816185,23.818266,23.820810,23.822493,23.825487,23.827185,23.829151,23.831768,23.834446,23.836571,23.839777,23.840616,23.844269,23.846645,23.848211,23.851875,23.854191,23.857193,23.859495,23.862547,23.864765,23.867043,23.870281,23.872450,23.875715,23.878571,23.881062,23.883896,23.886900,23.889311,23.892212,23.894709,23.897987,23.900656,23.903954,23.906429,23.909353,23.912544,23.915777,23.919043,23.921566,23.924667,23.927914,23.931471,23.934763,23.937317,23.940962,23.944069,23.947586,23.950855,23.954250,23.957853,23.961226,23.964855,23.968380,23.971968,23.975756,23.979509,23.983236,23.987127,23.991022,23.994991,23.999027,24.003132,24.007296,24.011494,24.015810,24.020220,24.024443,24.028967,24.033642,24.038376,24.043211,24.048037,24.052711,24.057850,24.063019,24.067814,24.073414,24.078512,24.084194,24.089527,24.095461,24.101076,24.107386,24.112783,24.119303,24.125311,24.131578,24.137863,24.144691,24.151670,24.159168,24.166008,24.172298,24.180050,24.187223,24.194633,24.202461,24.209686,24.217415,24.226418,24.234758,24.242804,24.252081,24.260687,24.268654,24.278214,24.287794,24.297548,24.306681,24.315739,24.326880,24.336922,24.347647,24.357014,24.368526,24.379282,24.390837,24.401278,24.413280,24.424503,24.436337,24.448198,24.459442,24.474834,24.485422,24.498074,24.511518,24.525110,24.538219,24.551525,24.566513,24.578856,24.595127,24.609001,24.623774,24.638792,24.655804,24.670010,24.685501,24.700382,24.715843,24.733932,24.749146,24.765707,24.781981,24.798489,24.815409,24.833902,24.850494,24.869417,24.886353,24.905313,24.922447,24.940359,24.958977,24.976782,24.994520,25.015141,25.034021,25.052227,25.072138,25.092316,25.112091,25.130554,25.151470,25.169922,25.189810,25.211119,25.230158,2
5.250587,25.271912,25.292097,25.312012,25.332474,25.352491,25.373247,25.393818,25.414803,25.436102,25.456818,25.476851,25.497627,25.518473,25.539694,25.559866,25.579750,25.600626,25.621981,25.641666,25.663225,25.682930,25.703144,25.723696,25.744856,25.764534,25.784210,25.804255,25.824829,25.843851,25.864376,25.883297,25.903139,25.922586,25.941490,25.961153,25.979187,25.999371,26.017347,26.035826,26.054289,26.072472,26.090544,26.108656,26.126755,26.144119,26.162392,26.179407,26.195862,26.213247,26.229919,26.246372,26.263029,26.279261,26.295158,26.311069,26.326868,26.342730,26.357920,26.373100,26.387819,26.402437,26.417196,26.431488,26.445543,26.459688,26.473192,26.486752,26.500223,26.513664,26.526655,26.539576,26.551842,26.564348,26.576859,26.588968,26.600800,26.612385,26.623827,26.635359,26.646292,26.657339,26.668089,26.678539,26.688923,26.699238,26.709257,26.719234,26.728701,26.738411,26.747732,26.756996,26.765993,26.774902,26.783548,26.792456,26.800482,26.808914,26.816866,26.824911,26.832787,26.840254,26.847769,26.855145,26.862507,26.869455,26.876469,26.883314,26.890049,26.896261,26.902765,26.909163,26.915098,26.921017,26.926981,26.932928,26.938467,26.943939,26.949520,26.954666,26.959999,26.964989,26.969843,26.974756,26.979532,26.984150,26.988760]

value14=[27.792612,27.792507,27.792395,27.792246,27.792068,27.791851,27.791605,27.791338,27.791039,27.790701,27.790319,27.789911,27.789488,27.789003,27.788515,27.787977,27.787397,27.786798,27.786160,27.785490,27.784771,27.784021,27.783236,27.782421,27.781555,27.780649,27.779728,27.778721,27.777719,27.776661,27.775555,27.774433,27.773216,27.771994,27.770702,27.769375,27.768000,27.766563,27.765104,27.763559,27.761995,27.760334,27.758656,27.756933,27.755062,27.753193,27.751289,27.749294,27.747192,27.745056,27.742857,27.740612,27.738304,27.735853,27.733318,27.730770,27.728119,27.725370,27.722593,27.719616,27.716667,27.713491,27.710291,27.706957,27.703588,27.699993,27.696362,27.692699,27.688808,27.684793,27.680647,27.676380,27.671951,27.667381,27.662735,27.657890,27.652800,27.647728,27.642447,27.636948,27.631229,27.625401,27.619251,27.613031,27.606527,27.599878,27.592987,27.585831,27.578466,27.570961,27.563000,27.554829,27.546371,27.537798,27.528805,27.519638,27.509991,27.500021,27.489882,27.479225,27.468277,27.456877,27.445318,27.433231,27.420746,27.407722,27.394325,27.380724,27.366493,27.351759,27.336647,27.320889,27.304609,27.288008,27.270792,27.252909,27.234573,27.215590,27.195969,27.175716,27.154779,27.133570,27.111374,27.088640,27.065022,27.041010,27.016003,26.990776,26.964045,26.937132,26.909220,26.880682,26.851223,26.821444,26.790436,26.758694,26.726442,26.693470,26.659279,26.624481,26.589121,26.552984,26.515736,26.478258,26.440071,26.400627,26.360422,26.319715,26.278582,26.236641,26.194674,26.151325,26.107782,26.064007,26.018768,25.973791,25.928822,25.882948,25.836859,25.789873,25.743668,25.696409,25.649654,25.602739,25.555576,25.508413,25.460938,25.414011,25.366961,25.319836,25.272955,25.226950,25.180830,25.135191,25.089878,25.044704,24.999851,24.956129,24.912725,24.869965,24.827393,24.785713,24.744844,24.704542,24.664986,24.626013,24.587854,24.550547,24.514296,24.478252,24.443140,24.409466,24.376194,24.343637,24.311979,24.281567,24.251633,24.222956,24.194704,2
4.167898,24.141518,24.116083,24.091574,24.067898,24.045271,24.023134,24.002361,23.982113,23.963123,23.944263,23.926889,23.909838,23.893763,23.878824,23.864511,23.850496,23.838017,23.825972,23.814909,23.804434,23.794493,23.785810,23.777100,23.769848,23.763351,23.756706,23.751875,23.746666,23.742342,23.739628,23.736506,23.734636,23.733744,23.732914,23.732414,23.732195,23.733349,23.736385,23.737877]

 

value15=[27.801014,27.800459,27.799885,27.799294,27.798651,27.798008,27.797329,27.796631,27.795906,27.795151,27.794373,27.793566,27.792738,27.791876,27.790989,27.790077,27.789146,27.788168,27.787170,27.786142,27.785093,27.784008,27.782892,27.781744,27.780569,27.779364,27.778124,27.776850,27.775543,27.774197,27.772829,27.771423,27.769979,27.768518,27.766989,27.765448,27.763855,27.762226,27.760586,27.758871,27.757137,27.755371,27.753523,27.751644,27.749767,27.747829,27.745808,27.743792,27.741709,27.739590,27.737402,27.735186,27.732895,27.730553,27.728214,27.725815,27.723303,27.720758,27.718231,27.715546,27.712887,27.710121,27.707331,27.704466,27.701544,27.698532,27.695492,27.692333,27.689167,27.685919,27.682594,27.679222,27.675785,27.672239,27.668625,27.664991,27.661194,27.657391,27.653494,27.649529,27.645456,27.641266,27.637104,27.632784,27.628334,27.623878,27.619312,27.614641,27.609875,27.605021,27.600080,27.595068,27.589844,27.584675,27.579304,27.573885,27.568396,27.562592,27.556913,27.550974,27.545017,27.538834,27.532602,27.526278,27.519800,27.513144,27.506550,27.499567,27.492708,27.485605,27.478331,27.470985,27.463480,27.455996,27.448162,27.440285,27.432236,27.424036,27.415672,27.407206,27.398516,27.389845,27.380846,27.371798,27.362562,27.353258,27.343662,27.333984,27.323992,27.314157,27.303936,27.293468,27.283131,27.272423,27.261707,27.250723,27.239336,27.228008,27.216682,27.204855,27.192999,27.181154,27.168985,27.156424,27.144157,27.131313,27.118467,27.105537,27.092035,27.078814,27.065453,27.051371,27.037806,27.023560,27.009363,26.994894,26.980339,26.965706,26.950743,26.935881,26.920397,26.904972,26.889788,26.874176,26.858366,26.842491,26.826174,26.809858,26.793810,26.777178,26.760481,26.743500,26.726494,26.709869,26.692640,26.675720,26.657669,26.640327,26.622774,26.604853,26.587166,26.569098,26.551563,26.533169,26.515272,26.497145,26.478661,26.460150,26.441702,26.423729,26.404783,26.386354,26.367382,26.349581,26.330305,26.312038,26.292904,26.274353,26.255867,2
6.237171,26.218163,26.199463,26.180752,26.162340,26.143753,26.125282,26.106150,26.088055,26.069628,26.050571,26.032486,26.014088,25.996029,25.977785,25.960005,25.941532,25.923798,25.905861,25.888588,25.870892,25.853304,25.835682,25.818319,25.801624,25.784096,25.767258,25.750761,25.733807,25.717072,25.701097,25.684719,25.669291,25.652557,25.636782,25.621147,25.605988,25.590351,25.575470,25.560757,25.545256,25.530401,25.517105,25.502048,25.488031,25.474340,25.460569,25.446819,25.433416,25.421589,25.407907,25.394400,25.382248,25.369669,25.357412,25.346428,25.333872,25.321800,25.311016,25.299702,25.288483,25.277393,25.266701,25.255436,25.245943,25.235731,25.225714,25.216396,25.206692,25.197201,25.187702,25.179489,25.170498,25.162804,25.153578,25.145416,25.138027,25.130423,25.122700,25.115690,25.108244,25.101263,25.093613,25.088310,25.081635,25.075254,25.069500,25.063490,25.058315,25.052198,25.046747,25.041706,25.037430,25.032719,25.027554,25.023008,25.019405,25.015335,25.011707,25.008356,25.004795,25.001245,24.998873,24.995180,24.993105,24.990345,24.988010,24.985689,24.983728,24.982573,24.979912]

 

value16=[27.805553,27.805452,27.805326,27.805141,27.804934,27.804707,27.804441,27.804146,27.803806,27.803448,27.803055,27.802616,27.802153,27.801638,27.801100,27.800528,27.799929,27.799288,27.798599,27.797873,27.797100,27.796282,27.795488,27.794607,27.793673,27.792713,27.791698,27.790634,27.789539,27.788370,27.787209,27.785978,27.784678,27.783331,27.781942,27.780560,27.779011,27.777456,27.775887,27.774204,27.772484,27.770658,27.768826,27.766857,27.764893,27.762827,27.760672,27.758480,27.756187,27.753736,27.751289,27.748739,27.746119,27.743340,27.740595,27.737606,27.734535,27.731436,27.728239,27.724876,27.721371,27.717766,27.714104,27.710150,27.706135,27.702023,27.697660,27.693205,27.688589,27.683794,27.678782,27.673717,27.668337,27.662867,27.657049,27.651077,27.644979,27.638554,27.631874,27.625154,27.617796,27.610493]

 

value17=[27.741627,27.741526,27.741430,27.741335,27.741240,27.741144,27.741047,27.740942,27.740849,27.740746,27.740654,27.740549,27.740459,27.740362,27.740263,27.740162,27.740067,27.739967,27.739864,27.739773,27.739668,27.739576,27.739473,27.739374,27.739269,27.739176,27.739079,27.738981,27.738878,27.738777,27.738678,27.738579,27.738485,27.738380,27.738287,27.738186,27.738083,27.737989,27.737888,27.737787,27.737686,27.737585,27.737484,27.737385,27.737284,27.737190,27.737082,27.736988,27.736885,27.736782,27.736683,27.736574,27.736475,27.736380,27.736279,27.736177,27.736074,27.735971,27.735861,27.735771,27.735672,27.735571,27.735468,27.735359,27.735271,27.735161,27.735054,27.734957,27.734850,27.734749,27.734650,27.734545,27.734444,27.734343,27.734234,27.734142,27.734035,27.733931,27.733818,27.733721,27.733624,27.733517,27.733404,27.733307,27.733213,27.733107,27.732994,27.732897,27.732788,27.732681,27.732582,27.732479,27.732374,27.732265,27.732162,27.732061,27.731955,27.731850,27.731741,27.731636,27.731535,27.731436,27.731329,27.731224,27.731119,27.731005,27.730906,27.730801,27.730698,27.730593,27.730478,27.730377,27.730274,27.730160,27.730066,27.729948,27.729836,27.729731,27.729633,27.729534,27.729422,27.729313,27.729210,27.729103,27.728996,27.728880,27.728775,27.728678,27.728563,27.728455,27.728348,27.728237,27.728142,27.728025,27.727913,27.727812,27.727701,27.727594,27.727488,27.727371,27.727274,27.727161,27.727055,27.726940,27.726831,27.726723,27.726612,27.726507,27.726398,27.726297,27.726179,27.726067,27.725965,27.725855,27.725735,27.725634,27.725525,27.725407,27.725302,27.725191,27.725086,27.724977,27.724861,27.724752,27.724649,27.724535,27.724421,27.724314,27.724194,27.724091,27.723978,27.723867,27.723759,27.723642,27.723534,27.723413,27.723310,27.723190,27.723085,27.722967,27.722862,27.722755,27.722641,27.722530,27.722410,27.722305,27.722191,27.722078,27.721968,27.721851,27.721741,27.721622,27.721510,27.721394,27.721287,27.721170,27.721050,27.720945,27.720833,2
7.720720,27.720600,27.720490,27.720381,27.720261,27.720152,27.720037,27.719931,27.719807,27.719694,27.719574,27.719473,27.719357,27.719240,27.719118,27.719000,27.718891,27.718775,27.718655,27.718542,27.718430,27.718309,27.718193,27.718088,27.717978,27.717848,27.717735,27.717621,27.717499,27.717388,27.717270,27.717159,27.717035,27.716923,27.716810,27.716690,27.716572,27.716455,27.716333,27.716219,27.716101,27.715988,27.715870,27.715754,27.715630,27.715517,27.715397,27.715273,27.715164,27.715042,27.714926,27.714808,27.714693,27.714569,27.714451,27.714329,27.714214,27.714090]

 

value18=[27.808060,27.807592,27.807171,27.806759,27.806385,27.806011,27.805668,27.805346,27.805035,27.804737,27.804459,27.804184,27.803921,27.803671,27.803436,27.803204,27.802975,27.802763,27.802547,27.802351,27.802162]



value19=[27.815674,27.815580,27.815458,27.815313,27.815134,27.814920,27.814680,27.814404,27.814108,27.813759,27.813385,27.812984,27.812550,27.812073,27.811579,27.811020,27.810455,27.809853,27.809212,27.808525,27.807814,27.807045,27.806265,27.805449,27.804579,27.803688,27.802717,27.801754,27.800722,27.799660,27.798473,27.797346,27.796160,27.794914,27.793612,27.792267,27.790869,27.789398,27.787933,27.786348,27.784729,27.783079,27.781324,27.779554,27.777735,27.775826,27.773851,27.771826,27.769732,27.767593,27.765318,27.762970,27.760544,27.758102,27.755507,27.752886,27.750124,27.747429,27.744448,27.741402,27.738344,27.735126,27.731758,27.728348,27.724821,27.721167,27.717426,27.713554,27.709547,27.705372,27.701048,27.696753,27.692207,27.687399,27.682600,27.677553,27.672310,27.666925,27.661453,27.655649,27.649775,27.643707,27.637388,27.630768,27.624130,27.617119,27.609900,27.602488,27.594814,27.586828,27.578779,27.570202,27.561293,27.552246,27.542978,27.533236,27.523180,27.513109,27.502363,27.491350,27.479912,27.468151,27.455774,27.443419,27.430531,27.417076,27.403357,27.388878,27.374006,27.358913,27.343189,27.326809,27.310118,27.292961,27.275179,27.256594,27.237774,27.218193,27.198172,27.177736,27.156309,27.134142,27.111687,27.088476,27.064096,27.039692,27.014496,26.988609,26.962032,26.934732,26.906715,26.877626,26.848309,26.818195,26.787462,26.755829,26.723305,26.690464,26.656706,26.622438,26.587366,26.551605,26.515207,26.477713,26.440269,26.401800,26.362864,26.323212,26.283791,26.243223,26.201895,26.160734,26.119158,26.076826,26.034014,25.990948,25.947506,25.904245,25.860611,25.816769,25.772591,25.729139,25.685003,25.640608,25.597073,25.552816,25.509539,25.465694,25.422379,25.379236,25.336237,25.293955,25.251799,25.210571,25.169172,25.128847,25.088524,25.048714,25.009949,24.971844,24.934141,24.897045,24.860491]

line_graph([value8,value9])

 

line_graph([value1,value2,value3,value4,value5,value6,value7,value8,value9,value10])

 

line_graph([value10])

 

line_graph([value11])

 

line_graph([value12])

 

line_graph([value13,value14])

 

line_graph([value13])

 

line_graph([value15])

 

line_graph([value15,value13])

 

line_graph([value16,value13])

 

line_graph([value15,value17])

 

line_graph([value15,value18])

 

line_graph([value16,value18])

 

line_graph([value15,value19])

 

line_graph([value16,value19])
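Several of these curves dip to a minimum and then climb again (overfitting or a too-aggressive learning rate). For comparing runs numerically rather than visually, a small helper can report each curve's minimum loss, the epoch it occurred at, and the final loss. This utility is my own addition for illustration, not part of the original code:

```python
def summarize(series):
    """Return (min_loss, epoch_of_min, final_loss) for one loss curve.

    A gap between min_loss and final_loss suggests training continued
    past the curve's best point.
    """
    lowest = min(series)
    return lowest, series.index(lowest), series[-1]

# Toy curve shaped like the appendix data: falls, bottoms out, rises.
curve = [27.8, 25.1, 24.0, 23.99, 24.2, 24.5]
low, at_epoch, final = summarize(curve)  # low=23.99, at_epoch=3, final=24.5
```

Applied to the arrays above, this makes it easy to rank the permutations by best achieved loss rather than eyeballing overlaid plots.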
