Search Results

Search found 968 results on 39 pages for 'lambda'.

Page 8/39 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Delphi: EInvalidOp in neural network class (TD-lambda)

    - by user89818
    I have the following draft for a neural network class. This neural network should learn with TD-lambda. It is started by calling the getRating() function. But unfortunately, there is an EInvalidOp (invalid floating point operation) error after about 1000 iterations in the following lines:

        neuronsHidden[j] := neuronsHidden[j]+neuronsInput[t][i]*weightsInput[i][j]; // input -> hidden

        weightsHidden[j][k] := weightsHidden[j][k]+LEARNING_RATE_HIDDEN*tdError[k]*eligibilityTraceOutput[j][k]; // adjust hidden->output weights according to TD-lambda

    Why does this error occur? I can't find the mistake in my code. :( Can you help me? Thank you very much in advance!

        unit uNeuronalesNetz;

        interface

        uses
          Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
          Dialogs, ExtCtrls, StdCtrls, Grids, Menus, Math;

        const
          NEURONS_INPUT = 43;          // number of neurons in the input layer
          NEURONS_HIDDEN = 60;         // number of neurons in the hidden layer
          NEURONS_OUTPUT = 1;          // number of neurons in the output layer
          NEURONS_TOTAL = NEURONS_INPUT+NEURONS_HIDDEN+NEURONS_OUTPUT; // total number of neurons in the network
          MAX_TIMESTEPS = 42;          // maximum number of timesteps possible (after 42 moves: board is full)
          LEARNING_RATE_INPUT = 0.25;  // in ideal case: decrease gradually in course of training
          LEARNING_RATE_HIDDEN = 0.15; // in ideal case: decrease gradually in course of training
          GAMMA = 0.9;
          LAMBDA = 0.7;                // decay parameter for eligibility traces

        type
          TFeatureVector = Array[1..43] of SmallInt; // definition of the array type TFeatureVector

          TArtificialNeuralNetwork = class // definition of the class TArtificialNeuralNetwork
          private
            // GENERAL SETTINGS START
            learningMode: Boolean; // does the network learn and change its weights?
            // GENERAL SETTINGS END
            // NETWORK CONFIGURATION START
            neuronsInput: Array[1..MAX_TIMESTEPS] of Array[1..NEURONS_INPUT] of Extended; // array of all input neurons (their values) for every timestep
            neuronsHidden: Array[1..NEURONS_HIDDEN] of Extended; // array of all hidden neurons (their values)
            neuronsOutput: Array[1..NEURONS_OUTPUT] of Extended; // array of output neurons (their values)
            weightsInput: Array[1..NEURONS_INPUT] of Array[1..NEURONS_HIDDEN] of Extended; // array of weights: input->hidden
            weightsHidden: Array[1..NEURONS_HIDDEN] of Array[1..NEURONS_OUTPUT] of Extended; // array of weights: hidden->output
            // NETWORK CONFIGURATION END
            // LEARNING SETTINGS START
            outputBefore: Array[1..NEURONS_OUTPUT] of Extended; // the network's output value in the last timestep (the one before)
            eligibilityTraceHidden: Array[1..NEURONS_INPUT] of Array[1..NEURONS_HIDDEN] of Array[1..NEURONS_OUTPUT] of Extended; // array of eligibility traces: hidden layer
            eligibilityTraceOutput: Array[1..NEURONS_TOTAL] of Array[1..NEURONS_TOTAL] of Extended; // array of eligibility traces: output layer
            reward: Array[1..MAX_TIMESTEPS] of Array[1..NEURONS_OUTPUT] of Extended; // the reward value for all output neurons in every timestep
            tdError: Array[1..NEURONS_OUTPUT] of Extended; // the network's error value for every single output neuron
            t: Byte; // current timestep
            cyclesTrained: Integer; // number of cycles trained so far (learning rates could be decreased accordingly)
            last50errors: Array[1..50] of Extended;
            // LEARNING SETTINGS END
          public
            constructor Create; // create the network object and do the initialization
            procedure UpdateEligibilityTraces; // update the eligibility traces for the hidden and output layer
            procedure tdLearning; // learning algorithm: adjust the network's weights
            procedure ForwardPropagation; // propagate the input values through the network to the output layer
            function getRating(state: TFeatureVector; explorative: Boolean): Extended; // get the rating for a given state (feature vector)
            function HyperbolicTangent(x: Extended): Extended; // calculate the hyperbolic tangent [-1;1]
            procedure StartNewCycle; // start a new cycle with everything set to default except for the weights
            procedure setLearningMode(activated: Boolean=TRUE); // switch the learning mode on/off
            procedure setInputs(state: TFeatureVector); // transfer the given feature vector to the input layer (set input neurons' values)
            procedure setReward(currentReward: SmallInt); // set the reward for the current timestep (with learning then or without)
            procedure nextTimeStep; // increase timestep t
            function getCyclesTrained(): Integer; // get the number of cycles trained so far
            procedure Visualize(imgHidden: Pointer); // visualize the neural network's hidden layer
          end;

        implementation

        procedure TArtificialNeuralNetwork.UpdateEligibilityTraces;
        var
          i, j, k: Integer;
        begin
          // how worthy is a weight to be adjusted?
          for j := 1 to NEURONS_HIDDEN do begin
            for k := 1 to NEURONS_OUTPUT do begin
              eligibilityTraceOutput[j][k] := LAMBDA*eligibilityTraceOutput[j][k]+(neuronsOutput[k]*(1-neuronsOutput[k]))*neuronsHidden[j];
              for i := 1 to NEURONS_INPUT do begin
                eligibilityTraceHidden[i][j][k] := LAMBDA*eligibilityTraceHidden[i][j][k]+(neuronsOutput[k]*(1-neuronsOutput[k]))*weightsHidden[j][k]*neuronsHidden[j]*(1-neuronsHidden[j])*neuronsInput[t][i];
              end;
            end;
          end;
        end;

        procedure TArtificialNeuralNetwork.setReward;
        var
          i: Integer;
        begin
          for i := 1 to NEURONS_OUTPUT do begin
            // +1 = player A wins
            //  0 = draw
            // -1 = player B wins
            reward[t][i] := currentReward;
          end;
        end;

        procedure TArtificialNeuralNetwork.tdLearning;
        var
          i, j, k: Integer;
        begin
          if learningMode then begin
            for k := 1 to NEURONS_OUTPUT do begin
              if reward[t][k] = 0 then begin
                tdError[k] := GAMMA*neuronsOutput[k]-outputBefore[k]; // network's error value when reward is 0
              end
              else begin
                tdError[k] := reward[t][k]-outputBefore[k]; // network's error value in the final state (reward received)
              end;
              for j := 1 to NEURONS_HIDDEN do begin
                weightsHidden[j][k] := weightsHidden[j][k]+LEARNING_RATE_HIDDEN*tdError[k]*eligibilityTraceOutput[j][k]; // adjust hidden->output weights according to TD-lambda
                for i := 1 to NEURONS_INPUT do begin
                  weightsInput[i][j] := weightsInput[i][j]+LEARNING_RATE_INPUT*tdError[k]*eligibilityTraceHidden[i][j][k]; // adjust input->hidden weights according to TD-lambda
                end;
              end;
            end;
          end;
        end;

        procedure TArtificialNeuralNetwork.ForwardPropagation;
        var
          i, j, k: Integer;
        begin
          for j := 1 to NEURONS_HIDDEN do begin
            neuronsHidden[j] := 0;
            for i := 1 to NEURONS_INPUT do begin
              neuronsHidden[j] := neuronsHidden[j]+neuronsInput[t][i]*weightsInput[i][j]; // input -> hidden
            end;
            neuronsHidden[j] := HyperbolicTangent(neuronsHidden[j]); // activation of hidden neuron j
          end;
          for k := 1 to NEURONS_OUTPUT do begin
            neuronsOutput[k] := 0;
            for j := 1 to NEURONS_HIDDEN do begin
              neuronsOutput[k] := neuronsOutput[k]+neuronsHidden[j]*weightsHidden[j][k]; // hidden -> output
            end;
            neuronsOutput[k] := HyperbolicTangent(neuronsOutput[k]); // activation of output neuron k
          end;
        end;

        procedure TArtificialNeuralNetwork.setLearningMode;
        begin
          learningMode := activated;
        end;

        constructor TArtificialNeuralNetwork.Create;
        var
          i, j, k: Integer;
        begin
          inherited Create;
          Randomize; // initialize random numbers generator
          learningMode := TRUE;
          cyclesTrained := -2; // only set to -2 because it will be increased twice in the beginning
          StartNewCycle;
          for j := 1 to NEURONS_HIDDEN do begin
            for k := 1 to NEURONS_OUTPUT do begin
              weightsHidden[j][k] := abs(Random-0.5); // initialize weights: 0 <= random < 0.5
            end;
            for i := 1 to NEURONS_INPUT do begin
              weightsInput[i][j] := abs(Random-0.5); // initialize weights: 0 <= random < 0.5
            end;
          end;
          for i := 1 to 50 do begin
            last50errors[i] := 0;
          end;
        end;

        procedure TArtificialNeuralNetwork.nextTimeStep;
        begin
          t := t+1;
        end;

        procedure TArtificialNeuralNetwork.StartNewCycle;
        var
          i, j, k, m: Integer;
        begin
          t := 1; // start in timestep 1
          cyclesTrained := cyclesTrained+1; // increase the number of cycles trained so far
          for j := 1 to NEURONS_HIDDEN do begin
            neuronsHidden[j] := 0;
            for k := 1 to NEURONS_OUTPUT do begin
              eligibilityTraceOutput[j][k] := 0;
              outputBefore[k] := 0;
              neuronsOutput[k] := 0;
              for m := 1 to MAX_TIMESTEPS do begin
                reward[m][k] := 0;
              end;
            end;
            for i := 1 to NEURONS_INPUT do begin
              for k := 1 to NEURONS_OUTPUT do begin
                eligibilityTraceHidden[i][j][k] := 0;
              end;
            end;
          end;
        end;

        function TArtificialNeuralNetwork.getCyclesTrained;
        begin
          result := cyclesTrained;
        end;

        procedure TArtificialNeuralNetwork.setInputs;
        var
          k: Integer;
        begin
          for k := 1 to NEURONS_INPUT do begin
            neuronsInput[t][k] := state[k];
          end;
        end;

        function TArtificialNeuralNetwork.getRating;
        begin
          setInputs(state);
          ForwardPropagation;
          result := neuronsOutput[1];
          if not explorative then begin
            tdLearning; // adjust the weights according to TD-lambda
            ForwardPropagation; // calculate the network's output again
            outputBefore[1] := neuronsOutput[1]; // set outputBefore which will then be used in the next timestep
            UpdateEligibilityTraces; // update the eligibility traces for the next timestep
            nextTimeStep; // go to the next timestep
          end;
        end;

        function TArtificialNeuralNetwork.HyperbolicTangent;
        begin
          if x > 5500 then // prevent overflow
            result := 1
          else
            result := (Exp(2*x)-1)/(Exp(2*x)+1);
        end;

        end.

  • Good Lambda Probe Alternative

    - by joel_gil
    Hello everyone, I was just wondering about a good Lambda Probe alternative for monitoring a MySQL server with GlassFish. The catch is that we don't want just a current-state monitor but something that will keep a history, since I am going to stress test some servers and would like something that outputs the data either in tables or graphs so I can then run some statistical analysis. I hope my question was understood; thanks in advance!

  • LINQ Query vs Lambda Expression

    - by FosterZ
    What is the difference between the following two snippets (i.e. a LINQ query vs. a lambda expression)? LINQ query:

        public Product GetProduct(int productID)
        {
            AdventureWorksDBDataContext db = new AdventureWorksDBDataContext();
            Product product = (from p in db.Products
                               where p.ProductID == productID
                               select p).Single();
            return product;
        }

    Using a lambda expression:

        public Product GetProduct(int productID)
        {
            AdventureWorksDBDataContext db = new AdventureWorksDBDataContext();
            Product product = db.Products.Single(p => p.ProductID == productID);
            return product;
        }
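
    A minimal sketch of how the two snippets relate (not part of the original question): the C# compiler translates query syntax into the same extension-method calls, so the query version expands to roughly the following.

        // What the query expression compiles down to (sketch):
        Product product = db.Products
                            .Where(p => p.ProductID == productID)
                            .Single();
        // A from/where/select with an identity projection becomes a Where
        // call; Single() then executes it, exactly as in the lambda version.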

  • Does lambda in List.ForEach lead to memory leaks and performance problems?

    - by Monomachus
    I have a problem which I could solve using something like this:

        sortedElements.ForEach((XElement el) => PrintXElementName(el, i++));

    This means that in ForEach I have a lambda which permits using parameters like int i. I like that way of doing it, but I read somewhere that anonymous methods and delegates with lambdas lead to a lot of memory leaks, because each time the lambda is executed something is instantiated but not released, or something like that. Could you please tell me if this is true in this situation, and if it is, why?
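
    A sketch of what the compiler emits for a capturing lambda may help frame the question (the class and member names below are illustrative, not the real compiler-generated ones):

        // The lambda capturing i is hoisted into a compiler-generated class:
        class Closure
        {
            public int i;
            public void Invoke(XElement el) { PrintXElementName(el, i++); }
        }
        // One such object is allocated when the delegate is created (not on
        // every call), and the GC reclaims it once the delegate becomes
        // unreachable, so nothing is permanently "not released".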

  • How to create an Expression tree to do the same as "StartsWith"

    - by Jonathan
    Hi to all. Currently, I have this method to compare two numbers:

        Private Function ETForGreaterThan(ByVal query As IQueryable(Of T), ByVal propertyValue As Object, ByVal propertyInfo As PropertyInfo) As IQueryable(Of T)
            Dim e As ParameterExpression = Expression.Parameter(GetType(T), "e")
            Dim m As MemberExpression = Expression.MakeMemberAccess(e, propertyInfo)
            Dim c As ConstantExpression = Expression.Constant(propertyValue, propertyValue.GetType())
            Dim b As BinaryExpression = Expression.GreaterThan(m, c)
            Dim lambda As Expression(Of Func(Of T, Boolean)) = Expression.Lambda(Of Func(Of T, Boolean))(b, e)
            Return query.Where(lambda)
        End Function

    It works fine and is consumed in this way:

        query = ETForGreaterThan(query, Value, propertyInfo)

    As you can see, I give it an IQueryable collection and it adds a Where clause to it, based on a property and a value. I can construct LessThan, LessThanOrEqual etc. equivalents, as Expression has these operators predefined. How can I transform this code to do the same with strings? Expression doesn't give me a predefined operator like "Contains" or "StartsWith", and I'm really a noob with expression trees. Thanks, and please post your answer in C#/VB; choose the one that makes you feel more comfortable.
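
    A sketch of one common approach, in C# since the question allows either language (the method name mirrors the original but is otherwise illustrative): where GreaterThan used a predefined binary operator node, a string predicate is built as a method-call node on String.StartsWith.

        using System;
        using System.Linq;
        using System.Linq.Expressions;
        using System.Reflection;

        static IQueryable<T> ETForStartsWith<T>(IQueryable<T> query, string propertyValue, PropertyInfo propertyInfo)
        {
            ParameterExpression e = Expression.Parameter(typeof(T), "e");
            MemberExpression m = Expression.MakeMemberAccess(e, propertyInfo);
            ConstantExpression c = Expression.Constant(propertyValue, typeof(string));
            // Build e.SomeStringProperty.StartsWith(propertyValue):
            MethodInfo startsWith = typeof(string).GetMethod("StartsWith", new[] { typeof(string) });
            MethodCallExpression call = Expression.Call(m, startsWith, c);
            Expression<Func<T, bool>> lambda = Expression.Lambda<Func<T, bool>>(call, e);
            return query.Where(lambda);
        }

    "Contains" and "EndsWith" work the same way; only the MethodInfo lookup changes.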

  • Lambda expressions in C#?

    - by user30457
    Hey, I'm rather new to these. Could someone explain the significance, or perhaps give a link to some useful information, on lambda expressions? I encountered the following code in a test and I am wondering why someone would do this:

        foo.MyEvent += (o, e) => { fCount++; Console.WriteLine(fCount); };
        foo.MyEvent -= (o, e) => { fCount++; Console.WriteLine(fCount); };

    My instinct tells me it is something simple and not a mistake, but I don't know enough about these expressions to understand why this is being done. Regards,
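
    A sketch of why that second line is the interesting part (assuming MyEvent is an ordinary EventHandler event): two lambdas with identical bodies are still two distinct delegate instances, so the -= above removes nothing and the handler stays subscribed.

        // To be able to unsubscribe, keep the delegate in a variable:
        EventHandler handler = (o, e) => { fCount++; Console.WriteLine(fCount); };
        foo.MyEvent += handler;
        // ... later ...
        foo.MyEvent -= handler; // this removal actually works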

  • How to write Newton-Raphson code in R involving integration and a Bessel function

    - by Ahmed
    I want to estimate the parameters of a function which involves a Bessel function and integration. However, when I tried to run it, I got the message "Error in f(x, ...) : could not find function "BesselI"". I don't know how to fix this and would appreciate any related proposal.

        library(Bessel)
        library(maxLik)
        library(miscTools)

        K <- 300
        f <- function(theta,lambda,u) {exp(-u*theta)*BesselI(2*sqrt(t*u*theta*lambda),1)/u^0.5}
        F <- function(theta,lambda) {integrate(f,0,K,theta=theta,lambda=lambda)$value}
        tt <- function(theta,lambda) {(sqrt(lambda)*exp(-t*lambda)/(2*sqrt(t*theta)))(theta(2*t*lambda-1)*F(theta,lambda)}
        loglik <- function(param) {
          theta <- param[1]
          lambda <- param[2]
          ll <- sum(log(tt(theta,lambda)))
        }
        t <- c(24,220,340,620,550,559,689,543)
        res <- maxNR(loglik, start=c(0.001,0.0005), print.level=1, tol=1e-08)
        summary(res)

  • Using System.DateTime in a C# Lambda expression gives an exception

    - by Samantha J
    I tried to implement a suggestion that came up in another question: Stackoverflow question. Snippet here:

        public static class StatusExtensions
        {
            public static IHtmlString StatusBox<TModel>(
                this HtmlHelper<TModel> helper,
                Expression<Func<TModel, RowInfo>> ex)
            {
                var createdEx = Expression.Lambda<Func<TModel, DateTime>>(
                    Expression.Property(ex.Body, "Created"), ex.Parameters);
                var modifiedEx = Expression.Lambda<Func<TModel, DateTime>>(
                    Expression.Property(ex.Body, "Modified"), ex.Parameters);
                var a = "a" + helper.HiddenFor(createdEx) + helper.HiddenFor(modifiedEx);
                return new HtmlString(
                    "Some things here ..." + helper.HiddenFor(createdEx) + helper.HiddenFor(modifiedEx));
            }
        }

    When implemented, I am getting the following exception, which I don't really understand. The exception points to the line starting with "var createdEx =":

        System.ArgumentException was unhandled by user code
          Message=Expression of type 'System.Nullable`1[System.DateTime]' cannot be used for return type 'System.DateTime'
          Source=System.Core
          StackTrace:

    Can anyone help me out and suggest what I could do to resolve the exception?
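
    A sketch of two possible fixes, on the assumption the message implies that Created and Modified are declared as DateTime? (System.Nullable<DateTime>) on RowInfo; the lambda's declared return type simply has to match the property type.

        // Either build the lambda with the matching nullable return type...
        var createdEx = Expression.Lambda<Func<TModel, DateTime?>>(
            Expression.Property(ex.Body, "Created"), ex.Parameters);

        // ...or insert an explicit conversion node to unwrap the value
        // (this would throw at runtime if the property is null):
        var createdConverted = Expression.Lambda<Func<TModel, DateTime>>(
            Expression.Convert(Expression.Property(ex.Body, "Created"), typeof(DateTime)),
            ex.Parameters);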

  • Summation using Lambda C# 3.0

    - by Newbie
    I have a small thing:

        public int GetSum(List<int> x)
        {
            foreach (int i in x)
                sum += i;
            return sum;
        }

    where x is a list of integers defined as

        List<int> l = new List<int>();
        l.Add(4);
        l.Add(2);
        l.Add(5);
        l.Add(8);
        l.Add(6);

    and has been passed as GetSum(l). Is it possible to rewrite the above foreach loop using a lambda? Reason: I started looking into the lambda stuff yesterday and am interested to learn, however simple it may be. Thanks.
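
    A sketch of the idiomatic rewrites (both need "using System.Linq;"):

        public int GetSum(List<int> x)
        {
            return x.Sum(); // built-in summation operator
            // or, with the lambda spelled out as a fold:
            // return x.Aggregate(0, (sum, i) => sum + i);
        }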

  • "Cannot use fixed local inside lambda expression"

    - by JulianR
    I have an XNA 3.0 project that compiled just fine in VS2008, but that gives compile errors in VS2010 (with XNA 4.0 CTP). The error:

        Cannot use fixed local 'depthPtr' inside an anonymous method, lambda expression, or query expression

    depthPtr is a fixed float* into an array that is used inside a Parallel.For lambda expression from System.Threading. As I said, this compiled and ran just fine in VS2008, but it does not in VS2010, even when targeting .NET 3.5. Has this changed in .NET 4.0, and even so, shouldn't it still compile when I choose .NET 3.5 as the target framework? Searching for the term "Cannot use fixed local" yields exactly one (useless) result, both in Google and Bing. If this has changed, what is the reason for it? I can imagine capturing a fixed pointer type in a closure could get a bit weird; is that why? So I'm guessing this is bad practice? And before anyone asks: no, the use of pointers is not absolutely critical here. I would still like to know though :)
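
    For what it's worth, a sketch of the workaround usually suggested for this error (the array and loop-bound names are assumptions, since the original code isn't shown): copy the fixed pointer into an ordinary pointer local, which a lambda may capture. This is only safe here because Parallel.For blocks until all iterations finish, so the array stays pinned for the lambda's whole lifetime.

        unsafe
        {
            fixed (float* depthPtrFixed = depthArray)
            {
                float* depthPtr = depthPtrFixed; // plain local: capturable
                Parallel.For(0, length, i =>
                {
                    depthPtr[i] *= 0.5f; // illustrative work on the buffer
                });
            } // the buffer is unpinned only after Parallel.For returns
        }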

  • Change the for loop with lambda (C# 3.0)

    - by Newbie
    Is it possible to do the same using a lambda?

        for (int i = 0; i < objEntityCode.Count; i++)
        {
            options.Attributes[i] = new EntityCodeKey();
            options.Attributes[i].EntityCode = objEntityCode[i].EntityCodes;
            options.Attributes[i].OrganizationCode = Constants.ORGANIZATION_CODE;
        }

    I mean to say: rewrite the statement using a lambda. I tried

        Enumerable.Range(0, objEntityCode.Count - 1).Foreach(i =>
        {
            options.Attributes[i] = new EntityCodeKey();
            options.Attributes[i].EntityCode = objEntityCode[i].EntityCodes;
            options.Attributes[i].OrganizationCode = Constants.ORGANIZATION_CODE;
        });

    but it is not working. I am using C# 3.0.
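
    A sketch of two fixes to the attempt above: Enumerable.Range takes a count rather than an upper bound, and IEnumerable<int> has no ForEach, but List<int> does (and the method name is ForEach, with a capital E), so materialize first.

        // Needs "using System.Linq;":
        Enumerable.Range(0, objEntityCode.Count).ToList().ForEach(i =>
        {
            options.Attributes[i] = new EntityCodeKey
            {
                EntityCode = objEntityCode[i].EntityCodes,
                OrganizationCode = Constants.ORGANIZATION_CODE
            };
        });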

  • Creating a property setter delegate

    - by Jim C
    I have created methods for converting a property lambda to a delegate:

        public static Delegate MakeGetter<T>(Expression<Func<T>> propertyLambda)
        {
            var result = Expression.Lambda(propertyLambda.Body).Compile();
            return result;
        }

        public static Delegate MakeSetter<T>(Expression<Action<T>> propertyLambda)
        {
            var result = Expression.Lambda(propertyLambda.Body).Compile();
            return result;
        }

    These work:

        Delegate getter = MakeGetter(() => SomeClass.SomeProperty);
        object o = getter.DynamicInvoke();

        Delegate getter = MakeGetter(() => someObject.SomeProperty);
        object o = getter.DynamicInvoke();

    but these won't compile:

        Delegate setter = MakeSetter(() => SomeClass.SomeProperty);
        setter.DynamicInvoke(new object[] { propValue });

        Delegate setter = MakeSetter(() => someObject.SomeProperty);
        setter.DynamicInvoke(new object[] { propValue });

    The MakeSetter lines fail with "The type arguments cannot be inferred from the usage. Try specifying the type arguments explicitly." Is what I'm trying to do possible? Thanks in advance.
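
    A sketch of one way the setter side could work (not from the original post; it assumes the lambda body is a simple property access): take a getter-shaped lambda, pull out the PropertyInfo, and bind a delegate directly to the set accessor.

        using System;
        using System.Linq.Expressions;
        using System.Reflection;

        public static Delegate MakeSetter<T>(Expression<Func<T>> propertyLambda)
        {
            var member = (MemberExpression)propertyLambda.Body;
            var property = (PropertyInfo)member.Member;

            // Evaluate the target instance; null for a static property.
            object target = member.Expression == null
                ? null
                : Expression.Lambda(member.Expression).Compile().DynamicInvoke();

            // Bind an Action<T> straight to the property's set accessor.
            return Delegate.CreateDelegate(typeof(Action<T>), target, property.GetSetMethod());
        }

    Usage then mirrors the getters: MakeSetter(() => someObject.SomeProperty).DynamicInvoke(propValue);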

  • IComparer using Lambda Expression

    - by josephj1989
        class p
        {
            public string Name { get; set; }
            public int Age { get; set; }
        };

        static List<p> ll = new List<p>
        {
            new p{Name="Jabc",Age=53}, new p{Name="Mdef",Age=20},
            new p{Name="Exab",Age=45}, new p{Name="G123",Age=19}
        };

        protected static void SortList()
        {
            IComparer<p> mycomp = (x, y) => x.Name.CompareTo(y.Name); // <== (Line 1)
            ll.Sort((x, y) => x.Name.CompareTo(y.Name));              // <== (Line 2)
        }

    Here List.Sort expects an IComparer<p> as a parameter, and it works with the lambda as shown in Line 2. But when I try to do as in Line 1, I get an error: "Cannot convert lambda expression to type 'System.Collections.Generic.IComparer<p>' because it is not a delegate type". I investigated this for quite some time but I still don't understand it. Maybe my understanding of IComparer is not quite good.
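
    A sketch of what is going on, plus a possible workaround (the adapter type is illustrative): Line 2 compiles because List<T>.Sort has an overload taking the delegate type Comparison<T>, and a lambda converts only to delegate types, never to an interface such as IComparer<T>.

        // Minimal adapter when an IComparer<T> is really required:
        class ComparisonComparer<T> : IComparer<T>
        {
            private readonly Comparison<T> comparison;
            public ComparisonComparer(Comparison<T> comparison) { this.comparison = comparison; }
            public int Compare(T x, T y) { return comparison(x, y); }
        }

        IComparer<p> mycomp = new ComparisonComparer<p>((x, y) => x.Name.CompareTo(y.Name)); // Line 1, fixed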

  • Can this be done using LINQ/lambda, C# 3.0?

    - by Newbie
    Objective: generate dates based on week numbers. Input: StartDate, WeekNumber. Output: a list of dates from the week number specified till the StartDate, i.e. if the start date is 23rd April, 2010 and the week number is 1, then the program should return the dates from 16th April, 2010 till the start date. The function:

        public List<DateTime> GetDates(DateTime startDate, int weeks)
        {
            List<DateTime> dt = new List<DateTime>();
            int days = weeks * 7;
            DateTime endDate = startDate.AddDays(-days);
            TimeSpan ts = startDate.Subtract(endDate);
            for (int i = 0; i <= ts.Days; i++)
            {
                DateTime dt1 = endDate.AddDays(i);
                dt.Add(dt1);
            }
            return dt;
        }

    I am calling this function as:

        DateTime StartDate = DateTime.ParseExact("20100423", "yyyyMMdd", System.Globalization.CultureInfo.InvariantCulture);
        List<DateTime> dtList = GetDates(StartDate, 1);

    The program is working fine. The question is: using C# 3.0 features like LINQ and lambdas, can I rewrite the program? Why? Because I am learning LINQ and lambdas and want to implement the same, but as of now my knowledge is not sufficient to do it by myself. Thanks.
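
    A sketch of a LINQ rewrite that returns the same list (needs "using System.Linq;"):

        public List<DateTime> GetDates(DateTime startDate, int weeks)
        {
            int days = weeks * 7;
            // One date per offset, from startDate - days up to startDate:
            return Enumerable.Range(0, days + 1)
                             .Select(i => startDate.AddDays(i - days))
                             .ToList();
        }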

  • Will C++1x support __stdcall or extern "C" capture-nothing lambdas?

    - by Daniel Trebbien
    Yesterday I was thinking about whether it would be possible to use the convenience of C++1x lambda functions to write callbacks for Windows API functions. For example, what if I wanted to use a lambda as an EnumChildProc with EnumChildWindows? Something like:

        EnumChildWindows(hTrayWnd, CALLBACK [](HWND hWnd, LPARAM lParam) {
            // ...
            return static_cast<BOOL>(TRUE); // continue enumerating
        }, reinterpret_cast<LPARAM>(&myData));

    Another use would be to write extern "C" callbacks for C routines. E.g.:

        my_class *pRes = static_cast<my_class*>(bsearch(&key, myClassObjectsArr,
                myClassObjectsArr_size, sizeof(my_class),
                extern "C" [](const void *pV1, const void *pV2) {
            const my_class& o1 = *static_cast<const my_class*>(pV1);
            const my_class& o2 = *static_cast<const my_class*>(pV2);
            int res;
            // ...
            return res;
        }));

    Is this possible?

  • Strategy for desugaring Haskell

    - by luqui
    I'm developing a virtual machine for purely functional programs, and I would like to be able to test and use the wide variety of Haskell modules already available. The VM takes as input essentially terms in the untyped lambda calculus. I'm wondering what would be a good way to extract such a representation from modern Haskell modules (e.g. with MPTCs, pattern guards, etc.). I did a little research and there doesn't seem to be a tool that does this already (I would be delighted to be mistaken), and that's okay; I'm looking for an approach. GHC Core seems too operationally focused, especially since one of the things the VM does is change the evaluation order significantly. Are there any accessible intermediate representations that correspond more closely to the lambda calculus?
