Search Results

Search found 59554 results on 2383 pages for 'distributed data mining'.

Page 689/2383 | < Previous Page | 685 686 687 688 689 690 691 692 693 694 695 696  | Next Page >

  • JavaScript execution exceeded timeout

    - by user1866265
    I use jQuery to develop a mobile application; my code is below. When I add 5 or 6 lines to the page, all goes well, but when I add many lines it displays the error message: "JavaScript execution exceeded timeout".

        function succes_recu_list_rubrique(tx, results) { // after filling SQLite
            console.log('ENTRééééééééééééééé---');
            $('#lbtn').prepend("<legend>Selectionner un Rubrique</legend><br>");
            // fill the list of step identifiers
            for (var i = 0; i < results.rows.length; i++) {
                $('#lbtn').append("<input name='opt1' checked type='radio' value=" + results.rows.item(i).IdRubrique + " id=" + results.rows.item(i).IdRubrique + " />");
                $('#lbtn').append('<label for=' + results.rows.item(i).IdRubrique + '>' + results.rows.item(i).LibelleRubrique + '</label>');
            }
            $('#lbtn').append('<a href="#page_dialog2" class="offer2" data-rel="dialog" data-role="button">Consulter</a>').trigger('create');
            $('#lbtn').append('<a href="#' + id_grp_rub + '" data-role="button" data-rel="back" data-theme="c">Cancel</a>').trigger('create');
        }
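
    The likely cause of this timeout is doing two DOM appends per row; with many rows, the repeated reflow and enhancement work exceeds the webview's script watchdog. A minimal sketch of one common fix (not from the original post): build the markup in a single string, write it to the DOM once, and enhance once.

        function succes_recu_list_rubrique(tx, results) {
            var html = "<legend>Selectionner un Rubrique</legend><br>";
            for (var i = 0; i < results.rows.length; i++) {
                var row = results.rows.item(i);
                // one string concatenation per row instead of two DOM appends
                html += "<input name='opt1' checked type='radio' value='" + row.IdRubrique + "' id='" + row.IdRubrique + "' />";
                html += "<label for='" + row.IdRubrique + "'>" + row.LibelleRubrique + "</label>";
            }
            // a single DOM write and a single enhancement pass
            $('#lbtn').html(html).trigger('create');
        }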

    Read the article

  • Can I use ext3 as a shared fs if I add a lock manager?

    - by edomaur
    I need a cluster filesystem for an iSCSI device. The problem is that the servers to which it is connected generate datafiles which must be read by every other server. Except for the writing and deleting of such files, I do not need a full locking scheme like in OCFS2 or GFS2. So, can I use a distributed lock manager (DLM) on top of an ext3 filesystem, or must I use a specialized filesystem?

    Read the article

  • lua userdata gc

    - by anon
    Is it possible for a piece of Lua userdata to hold a reference to a Lua object (like a table, or another piece of userdata)? Basically, what I want to know is: can I create a piece of userdata in such a way that when the GC runs, the userdata can say: "Hey! I'm holding references to these other objects, mark them as well." Thanks!
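
    For context, this is what the userdata's user value is for in the C API: values stored there are traversed by the collector, so they stay alive as long as the userdata does. A minimal sketch, assuming Lua 5.2/5.3 where lua_setuservalue exists (Lua 5.1 uses lua_setfenv with a table; Lua 5.4 uses lua_setiuservalue):

        #include <lua.h>
        #include <lauxlib.h>

        /* create a userdata that keeps the value passed as argument 1 alive */
        static int new_holder(lua_State *L) {
            luaL_checkany(L, 1);                  /* the object to hold */
            lua_newuserdata(L, sizeof(int));      /* the holder userdata */
            lua_pushvalue(L, 1);
            lua_setuservalue(L, -2);              /* the GC now marks the held object */
            return 1;                             /* return the userdata */
        }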

    Read the article

  • .getScript: getting the redirect URL of a JavaScript file

    - by user177883
    I'd like to execute a remote JavaScript file which redirects the user to another page on my domain, with data passed as a query string. I want to get this data which is passed on to the page on my domain.

        $.getScript('http://site.com/foo.js', function() {
            // foo.js redirects to another page on my domain with data,
            // and I'd like to capture that data from this function;
            // at least if I can find the parameters that were passed there, I'll be fine.
        });

    What to do? http://api.jquery.com/jQuery.getScript/
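
    A script loaded with .getScript cannot be inspected for its redirect target from the callback; the data has to be read where it lands. A sketch of one approach (an assumption, not the poster's code): parse the query string on the destination page.

        // on the page the remote script redirects to, e.g. /landing?foo=1&bar=2
        function queryParams() {
            var params = {};
            var pairs = window.location.search.substring(1).split('&');
            for (var i = 0; i < pairs.length; i++) {
                var p = pairs[i].split('=');
                if (p[0]) params[decodeURIComponent(p[0])] = decodeURIComponent(p[1] || '');
            }
            return params;
        }
        var data = queryParams(); // e.g. { foo: "1", bar: "2" }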

    Read the article

  • Naive Bayes matlab, row classification

    - by Jungle Boogie
    How do you classify a row of separate cells in MATLAB? At the moment I can classify single columns like so:

        training = [1;0;-1;-2;4;0;1];   % this is the sample data
        target_class = ['posi';'zero';'negi';'negi';'posi';'zero';'posi'];
        % target_class holds the target classes for the training data;
        % here 'positive' and 'negative' are the two classes for the given training data

        % Training and testing the classifier (between positive and negative)
        test = 10*randn(25, 1);   % this is for testing; I am generating random numbers
        class = classify(test, training, target_class, 'diaglinear')
        % classifies the test data, given the training data, using a naive Bayes classifier

    Unlike the above, I am looking to classify rows:

                 A   B   C
        Row A |  1 |  1 |  1  = a house
        Row B |  1 |  2 |  1  = a garden

    Can anyone help? Here is a code example from MATLAB's site:

        nb = NaiveBayes.fit(training, class)
        nb = NaiveBayes.fit(..., 'param1', val1, 'param2', val2, ...)

    I don't understand what param1 is or what val1 etc. should be.
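
    For classify (and NaiveBayes.fit), each row of the training matrix is one observation and each column one feature, so multi-column rows already work; 'param1'/'val1' in the documentation are just placeholders for optional name/value pairs such as 'Distribution'. A minimal sketch under those assumptions (made-up data, two rows per class so the pooled variances are nonzero):

        training = [1 1 1;    % house
                    2 1 2;    % house
                    1 2 1;    % garden
                    2 3 2];   % garden
        group = {'house'; 'house'; 'garden'; 'garden'};   % one label per training row

        sample = [1 2 1];                                 % a new row to classify
        label = classify(sample, training, group, 'diaglinear')

        % the name/value pairs are options, e.g.:
        % nb = NaiveBayes.fit(training, group, 'Distribution', 'normal');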

    Read the article

  • Return nullable datetime from scalar, stored procedure

    - by molgan
    Hello. I have a function that returns a date from a stored procedure, and it all works great until the value is NULL. How can I fix this so it works with NULL as well?

        public DateTime? GetSomeDate(int SomeID)
        {
            DateTime? LimitDate = null;
            if (_entities.Connection.State == System.Data.ConnectionState.Closed)
                _entities.Connection.Open();
            using (EntityCommand c = new EntityCommand("MyEntities.GetSomeDate", (EntityConnection)this._entities.Connection))
            {
                c.CommandType = System.Data.CommandType.StoredProcedure;
                EntityParameter paramSomeID = new EntityParameter("SomeID", System.Data.DbType.Int32);
                paramSomeID.Direction = System.Data.ParameterDirection.Input;
                paramSomeID.Value = SomeID;
                c.Parameters.Add(paramSomeID);
                var x = c.ExecuteScalar();
                if (x != null)
                    LimitDate = (DateTime)x;
                return LimitDate.Value;
            }
        }
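
    The failure is in the last few lines: when the procedure returns NULL, ExecuteScalar yields DBNull.Value rather than null, so the cast throws; and even if LimitDate stayed null, LimitDate.Value would throw because there is no value. A sketch of the usual fix (check for DBNull and return the nullable itself):

        var x = c.ExecuteScalar();
        if (x != null && x != DBNull.Value)
            LimitDate = (DateTime)x;
        return LimitDate;   // return the DateTime?, not .Value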

    Read the article

  • Reading from a database located in the Program Files folder using ODBC

    - by Dabblernl
    We have an application that stores its database files in a subfolder of the Program Files directory. These files are redirected to the VirtualStore in Vista and Windows 7. We present data from the database using Microsoft DataReports (VB6). So far so good. But we now want to use Crystal Reports XI to present data from the database. Our idea is to NOT pass this data to CR from our program, but to have CR retrieve it from the database using a system DSN through ODBC. In this way we hope to give our users more flexibility in designing their own reports. What we do want to ensure, though, is that these system DSNs are configured correctly when the user installs our program or when the program calls the Crystal Report. Is there a smart way to do this, using system variables for instance, instead of having to write a routine that checks for the OS version, whether UAC is enabled on the OS, whether the write restrictions on the Program Files folder have been lifted, etc., and then adapts the system DSN to point to either the C:\Program Files\OurApp\Data folder or the C:\Users\User\AppData\VirtualStore\Program Files\OurApp\Data folder? Suggestions for an entirely different approach are welcome too!

    Read the article

  • populate textboxes with XML node attributes

    - by Doug
    I have Data.xml:

        <?xml version="1.0" encoding="utf-8" ?>
        <data>
          <album>
            <slide title="Autum Leaves" description="Leaves from the fall of 1986"
                   source="images/Autumn Leaves.jpg" thumbnail="images/Autumn Leaves_thumb.jpg" />
            <slide title="Creek" description="Creek in Alaska"
                   source="images/Creek.jpg" thumbnail="images/Creek_thumb.jpg" />
          </album>
        </data>

    I'd like to be able to edit the attributes of each slide node via a GridView (that has a "Select" column added). So far I have:

        protected void GridView1_SelectedIndexChanged(object sender, EventArgs e)
        {
            int selectedIndex = GridView1.SelectedIndex;
            LoadXmlData(selectedIndex);
        }

        private void LoadXmlData(int selectedIndex)
        {
            XmlDocument xmldoc = new XmlDocument();
            xmldoc.Load(MapPath(@"..\photo_gallery\Data.xml"));
            XmlNodeList nodelist = xmldoc.DocumentElement.ChildNodes;
            XmlNode xmlnode = nodelist.Item(selectedIndex);
            titleTextBox.Text = xmlnode.Attributes["title"].InnerText;
            descriptionTextBox.Text = xmlnode.Attributes["description"].InnerText;
            sourceTextBox.Text = xmlnode.Attributes["source"].InnerText;
            thumbTextBox.Text = xmlnode.Attributes["thumbnail"].InnerText;
        }

    The code for LoadXmlData is just a guess on my part - I'm new to working with XML in this way. I'd like to have the user select a row from the GridView, then populate a set of text boxes with each slide's attributes for updating back to the Data.xml file. The error I'm getting is "Object reference not set to an instance of an object" at the line titleTextBox.Text = xmlnode.Attributes["@title"].InnerText; - so I'm not reaching the "title" attribute of the slide node. Thanks for any ideas you may have.
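
    Two things stand out here. DocumentElement is the <data> element, so its ChildNodes list contains only <album>, not the <slide> nodes; indexing it by the grid row returns a node with no title attribute, hence the null reference. And the Attributes indexer takes the plain attribute name ("title"); "@title" is XPath syntax. A sketch of a fix along those lines:

        // select the slide nodes directly, then index by the selected grid row
        XmlNodeList slides = xmldoc.SelectNodes("/data/album/slide");
        XmlNode slide = slides[selectedIndex];
        titleTextBox.Text = slide.Attributes["title"].Value;   // "title", not "@title"
        descriptionTextBox.Text = slide.Attributes["description"].Value;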

    Read the article

  • asynchronous sockets

    - by user158182
    I need to receive an acknowledgement from the server when it receives data, to confirm that the data has reached the server. Also, how do I use GetSocketOption() in an asynchronous method?
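
    TCP's own acknowledgements are not surfaced to the application, so a delivery confirmation has to be an application-level reply that the client reads back, e.g. with BeginReceive; GetSocketOption is the same synchronous call whether or not the I/O is asynchronous. A sketch, assuming socket is a connected System.Net.Sockets.Socket:

        // after sending, wait asynchronously for the server's application-level "ACK" reply
        byte[] ackBuf = new byte[16];
        socket.BeginReceive(ackBuf, 0, ackBuf.Length, SocketFlags.None, ar =>
        {
            int n = socket.EndReceive(ar);
            Console.WriteLine("server acknowledged {0} bytes", n);
        }, null);

        // GetSocketOption works the same way regardless of async I/O
        object keepAlive = socket.GetSocketOption(SocketOptionLevel.Socket, SocketOptionName.KeepAlive);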

    Read the article

  • VBScript - image to binary

    - by countnazgul
    Hi to all. I'm not a VB programmer, but I need a VBScript that converts an image file (from local disk) to binary data which is then passed to a web service. I have worked out how to pass data to the web service, but I can't find how to convert the image file to binary data. I have spent a lot of time trying to find some kind of solution, but with no luck. Can somebody help me? Thanks!
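
    A common way to read a file as binary in VBScript is an ADODB.Stream opened in binary mode; a minimal sketch (the file path is a placeholder):

        Const adTypeBinary = 1

        Dim stream, bytes
        Set stream = CreateObject("ADODB.Stream")
        stream.Type = adTypeBinary
        stream.Open
        stream.LoadFromFile "C:\images\photo.jpg"   ' placeholder path
        bytes = stream.Read                         ' byte array to hand to the web service
        stream.Close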

    Read the article

  • Return Double from Boost thread

    - by Benedikt Wutzi
    Hi, I have a Boost thread which should return a double. The function looks like this:

        void analyser::findup(const double startwl, const double max, double &myret) {
            this->data.begin();
            for (int i = (int)data.size(); i >= 0; i--) {
                if (this->data[i].lambda > startwl) {
                    if (this->data[i].db >= (max - 30)) {
                        myret = this->data[i + 1].lambda;
                        std::cout << "in thread " << myret << std::endl;
                        return;
                    }
                }
            }
        }

    This function is called by another function:

        void analyser::start_find_up(const double startwl, const double max) {
            double tmp = -42.0;
            boost::thread up(&analyser::findup, *this, startwl, max, tmp);
            std::cout << "before join " << tmp << std::endl;
            up.join();
            std::cout << "after join " << tmp << std::endl;
        }

    Anyway, I've tried and googled almost everything, but I can't get it to return a value. The output looks like this right now:

        before join -42
        in thread 843.487
        after join -42

    Thanks for any help.
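
    boost::thread copies its arguments, so the thread writes into a copy of tmp (and, because of *this, runs on a copy of the object too). The usual fix is to pass the object by pointer and wrap the out-parameter in boost::ref; a sketch reusing the class from the post:

        #include <boost/thread.hpp>
        #include <boost/ref.hpp>
        #include <iostream>

        void analyser::start_find_up(const double startwl, const double max) {
            double tmp = -42.0;
            // `this` avoids copying the object; boost::ref(tmp) makes the
            // thread write through a reference instead of into a copy
            boost::thread up(&analyser::findup, this, startwl, max, boost::ref(tmp));
            up.join();
            std::cout << "after join " << tmp << std::endl;   // now prints the found value
        }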

    Read the article

  • php tree structure problem

    - by agazerboy
    Hi all, I have all my data in my database. It has the following four columns:

        id  source_clust  target_clust  result_clust
        1   7             72            649
        2   9             572           650
        3   649           454           651
        4   32            650           435

    This data is like a tree structure: source_clust and target_clust merge to generate result_clust, and a result_clust can in turn be the source_clust or target_clust of a new result_clust. Is there any PHP function or class that I can use to generate a tree structure from my data? I saw the MySQL site doing exactly what I need, but I couldn't find how to implement that query on my data. Thanks!
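
    There is no built-in for this, but since each row records which two clusters produced a result cluster, a short recursive function can expand any cluster id into its subtree. A sketch, assuming the rows are already fetched into an array:

        <?php
        // each row: result_clust was produced by merging source_clust and target_clust
        $rows = array(
            array('id' => 1, 'source_clust' => 7,   'target_clust' => 72,  'result_clust' => 649),
            array('id' => 2, 'source_clust' => 9,   'target_clust' => 572, 'result_clust' => 650),
            array('id' => 3, 'source_clust' => 649, 'target_clust' => 454, 'result_clust' => 651),
            array('id' => 4, 'source_clust' => 32,  'target_clust' => 650, 'result_clust' => 435),
        );

        // index the merges by the cluster they produced
        $byResult = array();
        foreach ($rows as $r) {
            $byResult[$r['result_clust']] = array($r['source_clust'], $r['target_clust']);
        }

        // recursively expand a cluster id into its subtree
        function buildTree($id, $byResult) {
            if (!isset($byResult[$id])) {
                return array('cluster' => $id);   // a leaf cluster
            }
            list($src, $tgt) = $byResult[$id];
            return array(
                'cluster'  => $id,
                'children' => array(buildTree($src, $byResult), buildTree($tgt, $byResult)),
            );
        }

        print_r(buildTree(651, $byResult));   // 651 <- (649 <- (7, 72), 454)
        ?>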

    Read the article

  • Using the standard Java logging, is it possible to restart logs after a certain period?

    - by Fry
    I have some Java code that will be running as an importer for data for a much larger project. The initial logging code was done with the java.util.logging classes, so I'd like to keep it if possible, but it seems to be a little inadequate now given the amount of data passing through the importer. Oftentimes the importer will get data that the main system doesn't have information for, or that doesn't match the system's data, so it is ignored, but a message is written to the log about what information was dropped and why it wasn't imported. The problem is that this tends to grow in size very quickly, so we'd like to be able to start a fresh log daily or weekly. Does anybody have an idea if this can be done in the logging classes, or would I have to switch to log4j or something custom? Thanks for any help!
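
    java.util.logging has no built-in time-based rotation; its FileHandler rotates by size and generation count, so a size cap is the closest out-of-the-box approximation (true daily rotation means swapping handlers from a timer yourself, or using log4j's DailyRollingFileAppender). A minimal sketch of the size-based setup (file names and limits are made up):

        import java.io.IOException;
        import java.util.logging.FileHandler;
        import java.util.logging.Logger;
        import java.util.logging.SimpleFormatter;

        public class ImporterLogSetup {
            public static Logger create() throws IOException {
                Logger log = Logger.getLogger("importer");
                // rotate through importer0.log .. importer6.log, ~10 MB each;
                // %g is the generation number, true appends across restarts
                FileHandler fh = new FileHandler("importer%g.log", 10000000, 7, true);
                fh.setFormatter(new SimpleFormatter());
                log.addHandler(fh);
                return log;
            }
        }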

    Read the article

  • Django: How do I position a page when using Django templates

    - by swisstony
    I have a web page where the user enters some data and then clicks a submit button. I process the data and then use the same Django template to display the original data, the submit button, and the results. When I am using the Django template to display results, I would like the page to be automatically scrolled down to the part of the page where the results begin. This allows the user to scroll back up the page if she wants to change her original data and click submit again. Hopefully, there's some simple way of doing this that I can't see at the moment.
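
    The usual trick needs no JavaScript: put a named anchor where the results start and submit the form to the same URL with that fragment, so the browser scrolls there when the response loads. A sketch (the anchor name and context variables are made up):

        <form method="post" action="#results">
            {{ form }}
            <input type="submit" value="Submit">
        </form>

        {% if results %}
        <a id="results"></a>
        <h2>Results</h2>
        {{ results }}
        {% endif %}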

    Read the article

  • Group variables in a boxplot in R

    - by tao.hong
    I am trying to generate a boxplot whose data come from two scenarios. In the plot, I would like to group boxes by their names (so there will be two boxes per variable). I know ggplot would be a good choice, but I got errors which I could not figure out. Can anyone give me some suggestions?

        sensitivity_out1 <- structure(c(0.0522902104339716, 0.0521369824334004, 0.0520240345973737,
            0.0519818337359876, 0.051935071418996, 0.0519089404325544,
            0.000392698277338341, 0.000326135474295325, 0.000280863338343747,
            0.000259631566041935, 0.000246594043996332, 0.000237923540393391,
            0.00046732650331544, 0.000474448907808135, 0.000478287273678457,
            0.000480194683464109, 0.000480631753078668, 0.000481760272726273,
            0.000947965771207979, 0.000944821699830455, 0.000939631071343889,
            0.000937186900570605, 0.000936007346568281, 0.000934756220144141,
            0.00132442589501872, 0.00132658367774979, 0.00133334696220742,
            0.00133622384928092, 0.0013381577476241, 0.00134005741746304,
            0.0991622968751298, 0.100791399440082, 0.101946808417405,
            0.102524244727408, 0.102920085260477, 0.103232984259916,
            0.0305219507186844, 0.0304635269233494, 0.0304161055015213,
            0.0303742106794513, 0.0303381888169022, 0.0302996157711171,
            1.94268588634518e-05, 2.23991225564447e-05, 2.5756135487907e-05,
            2.79997917298194e-05, 3.00753967077715e-05, 3.16270817369878e-05,
            0.544701146678523, 0.542887331601984, 0.541632986366816,
            0.541005610554556, 0.540617004208336, 0.540315690692195,
            0.000453386694666078, 0.000448473414508756, 0.00044692043197248,
            0.000444826296854332, 0.000445747996014684, 0.000444764303682453,
            0.000127569551159321, 0.000128422491392669, 0.00012933662856487,
            0.000129941842982939, 0.000129578971489026, 0.000131113075233758,
            0.00684610571790029, 0.00686349387897349, 0.00687468164010565,
            0.00687880720347743, 0.00688275579317197, 0.00687822247621936), .Dim = c(6L, 12L))

        sensitivity_out2 <- structure(c(0.0189965816735366, 0.0189995096225103, 0.0190099362589894,
            0.0190033523148514, 0.01900896721937, 0.0190099427513381,
            0.00192043989797585, 0.00207303208721059, 0.00225931163225165,
            0.0024049969048389, 0.00252310364086785, 0.00262940166568126,
            0.00195164921633517, 0.00190079923515755, 0.00186139563778548,
            0.00184188171395076, 0.00183248544676564, 0.00182492970673969,
            1.83038731485927e-05, 1.98252671720347e-05, 2.14794764479231e-05,
            2.30713122969332e-05, 2.4484220713564e-05, 2.55958833705284e-05,
            0.0428066864455102, 0.0431686808647809, 0.0434411033615353,
            0.0435883377765726, 0.0436690169266633, 0.0437340464360965,
            0.145288252474567, 0.141488776430307, 0.138204532539654,
            0.136281799717717, 0.134864952272761, 0.133738386148036,
            0.0711728636959696, 0.072031388688795, 0.0727536853228245,
            0.0731581966147734, 0.0734424337399303, 0.0736637270702609,
            0.000605277151497094, 0.000617268349064968, 0.000632975679951382,
            0.000643904422677427, 0.000653775268094148, 0.000662225067910141,
            0.26735354610469, 0.267515415990146, 0.26753155165617,
            0.267553498616325, 0.267532284594615, 0.267510330320289,
            0.000334158771646756, 0.000319032383145857, 0.000306074699839994,
            0.000299153278494114, 0.000293956197852583, 0.000290171804454218,
            0.000645975219899115, 0.000637548672578787, 0.000632375486965757,
            0.000629579821884212, 0.000624956458229123, 0.000622456283217054,
            0.0645188290106884, 0.0651539609630352, 0.0656417364889907,
            0.0658996698322889, 0.0660715073023965, 0.0662034341510152), .Dim = c(6L, 12L))

    Melt data:

          group variable value
        1     1   PLDKRT     0
        2     1   PLDKRT     0
        3     1   PLDKRT     0
        4     1   PLDKRT     0
        5     1   PLDKRT     0
        6     1   PLDKRT     0

    Code:

        # Data source 1
        sensitivity_1 = rbind(sensitivity_out1, sensitivity_out2)
        sensitivity_1 = data.frame(sensitivity_1)
        colnames(sensitivity_1) = main_l   # variable names
        sensitivity_1$group = 1

        # Data source 2
        sensitivity_2 = rbind(sensitivity_out1[3:4, ], sensitivity_out2[3:4, ])
        sensitivity_2 = data.frame(sensitivity_2)
        colnames(sensitivity_2) = main_l
        sensitivity_2$group = 2

        sensitivity_pool = rbind(sensitivity_1, sensitivity_2)
        sensitivity_pool_m = melt(sensitivity_pool, id.vars = "group")

        ggplot(data = sensitivity_pool_m, aes(x = variable, y = value)) +
            geom_boxplot(aes(fill = group), width = 0.8)

    Error:

        Error in unit(tic_pos.c, "mm") : 'x' and 'units' must have length > 0

    Update: figured out the error. I should use geom_boxplot(aes(fill = factor(group)), width = 0.8) rather than fill = group.

    Read the article

  • What MS technology to use for HTTP service returning XML?

    - by Borek
    I need to create a service that:

    - accepts HTTP requests (with query string or HTTP POST parameters)
    - does some processing on the requests (checking if the request is valid, authentication, etc.)
    - reads data from a custom store (another HTTP call in our case)
    - returns the result as custom XML (defined with XSD)

    I'm trying to think of various MS technologies that could help me and how good they would be for this scenario (a pretty standard one, I guess). The tasks above are relatively separate; this is what comes to mind:

    HTTP front-end:
    - ASP.NET Web Forms
    - ASP.NET MVC (seems more appropriate here as I won't need server controls, view state, etc.)
    - WCF? I don't know much about it or how well it would suit my task.

    Custom logic on the server: this will probably be generic C# code in all cases (sometimes "plugged into" or called from MVC controllers or some equivalent place in other technologies).

    Reading data from internal data stores: as said, this is another HTTP server in our case. Options that come to mind:
    - just read the data using something like WebClient
    - (just theoretically) implement a LINQ provider
    - (just even more theoretically) implement an EF provider

    Output the data as custom XML:
    - Linq2XML
    - serialization? Is it flexible enough? Does WCF provide some tools for this?
    - some "OXM" - an Object/XML mapper, if there is something like that for .NET

    I may be wrong in many of my assumptions; this is just a quick list that comes to mind after quick research. Some general notes / questions:

    - Testing is important.
    - A solution with a clear domain model would be much preferred over one without.
    - Can Entity Framework actually help somewhere in my scenario? If so, where and how?
    - Would WCF be an appropriate technology for this? I don't know much about it.
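
    For scale, a sketch of how the MVC option might look end to end: an action that validates the request, reads from the backing HTTP store with WebClient, and shapes the reply with LINQ to XML (the controller, element names, and store URL are all invented for illustration):

        using System.Net;
        using System.Web;
        using System.Web.Mvc;
        using System.Xml.Linq;

        public class ReportController : Controller
        {
            public ActionResult Get(string id)
            {
                if (string.IsNullOrEmpty(id))
                    throw new HttpException(400, "id is required");        // request validation

                string raw;
                using (var client = new WebClient())                       // the internal HTTP data store
                    raw = client.DownloadString("http://internal-store/items/" + id);

                var xml = new XDocument(
                    new XElement("result",
                        new XElement("id", id),
                        new XElement("payload", raw)));                    // shape to match the XSD

                return Content(xml.ToString(), "text/xml");
            }
        }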

    Read the article

  • modifying a cloned element and reinserting in dom - jquery

    - by Praveen Prasad
    //dom
        <div id='toBeCloned'><span>Some name</span></div>
        <div id='targetElm'></div>

    //js
        $(function () {
            // creating a clone
            var _clone = $('#toBeCloned').clone(true);
            // target element
            var _target = $('#targetElm');
            // now the target element is to be filled with the cloned element, with the span's data changed
            var _someData = [1, 2, 3, 4];
            // loop through data
            $.each(_someData, function (i, data) {
                var _newElm = {};
                $.extend(_newElm, _clone);       // copy cloned to new elm
                _newElm.find('span').html(data); // edit content of span
                alert('p'); // alert added to show that the append on the next line, in spite of adding a new element to the DOM, just updates the previous one
                _target.append(_newElm);         // update target
            });
        });

    Expected result: 1 2 3 4. The result I am getting is: 4.
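
    $.extend copies property references, so every _newElm still wraps the same cloned DOM node; each append then just moves and rewrites that one node. A sketch of the usual fix: make a fresh clone per iteration.

        $(function () {
            var _target = $('#targetElm');
            var _someData = [1, 2, 3, 4];
            $.each(_someData, function (i, data) {
                // a fresh clone per iteration, so each append gets its own DOM node
                var _newElm = $('#toBeCloned').clone(true);
                _newElm.find('span').html(data);
                _target.append(_newElm);
            });
        });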

    Read the article

  • unable to load library at runtime in android application

    - by Addy
    Hi. I'm working on an Android application in which I use JNI for native C code. I built this application on Android 2.0 with NDK r3, and it works fine. Now, when I changed to Android SDK version 1.5 (API level 3), I face the problem of being unable to open the library libtest_demo.so:

        05-13 16:54:23.603: INFO/dalvikvm(1211): Unable to dlopen(/data/data/org.abc.test_demo/lib/libtest_demo.so): Cannot find library

    I put the libtest_demo.so file in the same place, /data/data/org.abc.test_demo/lib/libtest_demo.so, but the problem still arises. Please help me.
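
    For reference, the usual loading code refers to the library by its short name and lets the system resolve it to libtest_demo.so in the app's lib directory; a sketch:

        static {
            // resolves to /data/data/<package>/lib/libtest_demo.so
            System.loadLibrary("test_demo");
        }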

    Read the article

  • A smarter way to do this jQuery?

    - by Nicky Christensen
    I have a map on my site, and when clicking regions a class should be toggled, on both hover and click. I've made a jQuery solution to do this; however, I think it can be done a bit smarter than I've done it. My HTML output is this:

        <div class="mapdk">
            <a data-class="nordjylland" class="nordjylland" href="#"><span>Nordjylland</span></a>
            <a data-class="midtjylland" class="midtjylland" href="#"><span>Midtjylland</span></a>
            <a data-class="syddanmark" class="syddanmark" href="#"><span>Syddanmark</span></a>
            <a data-class="sjaelland" class="sjalland" href="#"><span>Sjælland</span></a>
            <a data-class="hovedstaden" class="hovedstaden" href="#"><span>Hovedstaden</span></a>
        </div>

    And my jQuery looks like:

        if ($jq(".area .mapdk").length) {
            $jq(".mapdk a.nordjylland").hover(function () {
                $jq(".mapdk").toggleClass("nordjylland");
            }).click(function () {
                $jq(".mapdk").toggleClass("nordjylland");
            });
            $jq(".mapdk a.midtjylland").hover(function () {
                $jq(".mapdk").toggleClass("midtjylland");
            }).click(function () {
                $jq(".mapdk").toggleClass("midtjylland");
            });
        }

    The thing is that with what I've done, I have to write a hover and click function for every link I've got. I was thinking I might keep it in one hover/click function and then use something like $jq(this), but I'm not sure how.
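
    The data-class attribute already carries everything the handler needs, so one binding can cover every region by reading the attribute from the element that fired the event. A sketch (assuming jQuery 1.7+ for .on; older versions would use .bind):

        if ($jq(".area .mapdk").length) {
            $jq(".mapdk a").on("mouseenter mouseleave click", function () {
                // toggle the class named by the link that fired the event
                $jq(".mapdk").toggleClass($jq(this).data("class"));
            });
        }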

    Read the article

  • HDFS some datanodes of cluster are suddenly disconnected while reducers are running

    - by user1429825
    I have 8 slave computers and 1 master computer running Hadoop (version 0.21). Some datanodes of the cluster are suddenly disconnected while I am running MapReduce code on 10 GB of data. After all mappers finished and around 80% of the reducers had been processed, randomly one or more datanodes disconnected from the network, and then the other datanodes started to disappear from the network, even though I killed the MapReduce job when I found some datanodes were disconnected.

    I've tried changing dfs.datanode.max.xcievers to 4096, turned off the firewalls of all computing nodes, disabled SELinux, and increased the open-file limit to 20000, but none of it worked at all. Does anyone have an idea how to solve this problem? The following is the error log from MapReduce:

        12/06/01 12:31:29 INFO mapreduce.Job: Task Id : attempt_201206011227_0001_r_000006_0, Status : FAILED
        java.io.IOException: Bad connect ack with firstBadLink as ***.***.***.148:20010
            at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:889)
            at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:820)
            at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)

    And the following are logs from a datanode:

        2012-06-01 13:01:01,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_-5549263231281364844_3453 src: /*.*.*.147:56205 dest: /*.*.*.142:20010
        2012-06-01 13:01:01,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(*.*.*.142:20010, storageID=DS-1534489105-*.*.*.142-20010-1337757934836, infoPort=20075, ipcPort=20020) Starting thread to transfer block blk_-3849519151985279385_5906 to *.*.*.147:20010
        2012-06-01 13:01:19,135 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(*.*.*.142:20010, storageID=DS-1534489105-*.*.*.142-20010-1337757934836, infoPort=20075, ipcPort=20020):Failed to transfer blk_-5797481564121417802_3453 to *.*.*.146:20010 got java.net.ConnectException: Connection timed out
            at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
            at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
            at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
            at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:373)
            at org.apache.hadoop.hdfs.server.datanode.DataNode$DataTransfer.run(DataNode.java:1257)
            at java.lang.Thread.run(Thread.java:722)
        2012-06-01 13:06:20,342 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_6674438989226364081_3453
        2012-06-01 13:09:01,781 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(*.*.*.142:20010, storageID=DS-1534489105-*.*.*.142-20010-1337757934836, infoPort=20075, ipcPort=20020):Failed to transfer blk_-3849519151985279385_5906 to *.*.*.147:20010 got java.net.SocketTimeoutException: 480000 millis timeout while waiting for channel to be ready for write. ch : java.nio.channels.SocketChannel[connected local=/*.*.*.142:60057 remote=/*.*.*.147:20010]
            at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:246)
            at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:164)
            at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:203)
            at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:388)
            at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:476)
            at org.apache.hadoop.hdfs.server.datanode.DataNode$DataTransfer.run(DataNode.java:1284)
            at java.lang.Thread.run(Thread.java:722)

    hdfs-site.xml:

        <configuration>
          <property>
            <name>dfs.name.dir</name>
            <value>/home/hadoop/data/name</value>
          </property>
          <property>
            <name>dfs.data.dir</name>
            <value>/home/hadoop/data/hdfs1,/home/hadoop/data/hdfs2,/home/hadoop/data/hdfs3,/home/hadoop/data/hdfs4,/home/hadoop/data/hdfs5</value>
          </property>
          <property>
            <name>dfs.replication</name>
            <value>3</value>
          </property>
          <property>
            <name>dfs.datanode.max.xcievers</name>
            <value>4096</value>
          </property>
          <property>
            <name>dfs.http.address</name>
            <value>0.0.0.0:20070</value>
            <description>50070 The address and the base port where the dfs namenode web ui will listen on. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.http.address</name>
            <value>0.0.0.0:20075</value>
            <description>50075 The datanode http server address and port. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.secondary.http.address</name>
            <value>0.0.0.0:20090</value>
            <description>50090 The secondary namenode http server address and port. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.address</name>
            <value>0.0.0.0:20010</value>
            <description>50010 The address where the datanode server will listen to. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.ipc.address</name>
            <value>0.0.0.0:20020</value>
            <description>50020 The datanode ipc server address and port. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.https.address</name>
            <value>0.0.0.0:20475</value>
          </property>
          <property>
            <name>dfs.https.address</name>
            <value>0.0.0.0:20470</value>
          </property>
        </configuration>

    mapred-site.xml:

        <configuration>
          <property>
            <name>mapred.job.tracker</name>
            <value>masternode:29001</value>
          </property>
          <property>
            <name>mapred.system.dir</name>
            <value>/home/hadoop/data/mapreduce/system</value>
          </property>
          <property>
            <name>mapred.local.dir</name>
            <value>/home/hadoop/data/mapreduce/local</value>
          </property>
          <property>
            <name>mapred.map.tasks</name>
            <value>32</value>
            <description>default number of map tasks per job.</description>
          </property>
          <property>
            <name>mapred.tasktracker.map.tasks.maximum</name>
            <value>4</value>
          </property>
          <property>
            <name>mapred.reduce.tasks</name>
            <value>8</value>
            <description>default number of reduce tasks per job.</description>
          </property>
          <property>
            <name>mapred.map.child.java.opts</name>
            <value>-Xmx2048M</value>
          </property>
          <property>
            <name>io.sort.mb</name>
            <value>500</value>
          </property>
          <property>
            <name>mapred.task.timeout</name>
            <value>1800000</value> <!-- 30 minutes -->
          </property>
          <property>
            <name>mapred.job.tracker.http.address</name>
            <value>0.0.0.0:20030</value>
            <description>50030 The job tracker http server address and port the server will listen on. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>mapred.task.tracker.http.address</name>
            <value>0.0.0.0:20060</value>
            <description>50060</description>
          </property>
        </configuration>

    Read the article

  • Parsing the json and storing it in an array?

    - by Prateek Raj
    Hi everyone, I'm very new to these web-related problems; please help. I'm working on Sencha, which proves to be very difficult when it comes to JSON parsing, so I'm planning on retrieving the data in the HTML page and then loading it into my JS file. Here is the problem: I've already asked about it and got a reply, http://jsbin.com/uwuca5, but now when I use the HTML source code locally on my system, or even through IIS, I can't parse the data. Here is the link for my JSON file: http://compliantbox.com/optionsedge/sample.php. I'm trying to use this link in my code and retrieve the data, but the data returned is null. Please help. Thank you.
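
    Served from the local filesystem or a different host, that URL is a cross-origin request, which is the usual reason the data comes back null. One common workaround is JSONP via a script tag; a sketch, assuming (and this is an assumption about that PHP endpoint) the server can wrap its JSON in a callback parameter:

        // the server would need to emit: handleData({...the JSON...});
        function handleData(json) {
            console.log(json);   // parsed data, ready to store in an array
        }
        var s = document.createElement('script');
        s.src = 'http://compliantbox.com/optionsedge/sample.php?callback=handleData';
        document.body.appendChild(s);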

    Read the article

  • Which to use, XMP or RDF?

    - by zotty
    What's the difference between RDF and XMP? From what I can tell, XMP is derived from RDF - so what does it offer that RDF doesn't? My particular situation is this: I've got some images which need tagging with details of how an experiment was performed and what sort of data analysis has been performed on the images. A colleague of mine is pushing for XMP, but he's thinking of the images as photos - they're not really, they're just bits of data. From what I've seen (mainly by opening images in Notepad++), the XMP data looks very similar to RDF - even so far as using RDF in the tag names (e.g. <rdf:Seq>). I'd like this data to be usable by other people who use similar instruments for similar experiments, so creating a mini standard (schema?) seems like the way to go. Apologies for the lack of fundamental understanding - I'm a Doctor, not a programmer! If it makes any difference, the language of choice will be C#. Edit for more information: first off, thanks for the excellent replies - thinking of XMP as a vocabulary for RDF makes things a lot clearer. The sort of data I'll be storing won't be available in any of the pre-defined sets. It'll detail experimental set-ups, locations and results. I think using RDF is the way to go.
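
    Since XMP is essentially RDF/XML embedded in the file under Adobe's packet conventions, a custom experiment vocabulary can ride inside it; a sketch of what such a packet might look like (the exp: namespace and its property names are invented for illustration):

        <x:xmpmeta xmlns:x="adobe:ns:meta/">
          <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                   xmlns:exp="http://example.org/ns/experiment/1.0/">
            <rdf:Description rdf:about=""
                             exp:instrument="confocal-02"
                             exp:analysis="background-subtraction">
              <exp:steps>
                <rdf:Seq>
                  <rdf:li>acquire</rdf:li>
                  <rdf:li>normalise</rdf:li>
                </rdf:Seq>
              </exp:steps>
            </rdf:Description>
          </rdf:RDF>
        </x:xmpmeta>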

    Read the article
