Search Results

Search found 58956 results on 2359 pages for 'data structures'.

  • Using java.util.logging, is it possible to restart logs after a certain period of time?

    - by Fry
    I have some Java code that will be running as an importer of data for a much larger project. The initial logging code was done with the java.util.logging classes, so I'd like to keep it if possible, but it seems a little inadequate now given the amount of data passing through the importer. Often, the importer will get data that the main system doesn't have information for, or that doesn't match the system's data, so it is ignored, but a message is written to the log about what information was dropped and why it wasn't imported. The problem is that this log tends to grow in size very quickly, so we'd like to be able to start a fresh log daily or weekly. Does anybody know if this can be done with the logging classes, or would I have to switch to log4j or a custom solution? Thanks for any help!
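
    Worth noting: java.util.logging's FileHandler rotates only by size (its limit/count constructor arguments), not by time, so a fresh daily log means swapping handlers on a schedule yourself. A minimal sketch of that approach; the logger name and file-name pattern are placeholder assumptions:

        import java.io.IOException;
        import java.time.LocalDate;
        import java.util.concurrent.Executors;
        import java.util.concurrent.ScheduledExecutorService;
        import java.util.concurrent.TimeUnit;
        import java.util.logging.FileHandler;
        import java.util.logging.Logger;
        import java.util.logging.SimpleFormatter;

        public class DailyRotatingLog {
            private static final Logger LOG = Logger.getLogger("importer");
            private static FileHandler current;

            // Detach and close yesterday's handler, then attach one named for today.
            static synchronized void rotate() throws IOException {
                if (current != null) {
                    LOG.removeHandler(current);
                    current.close();
                }
                current = new FileHandler("importer-" + LocalDate.now() + ".log", true);
                current.setFormatter(new SimpleFormatter());
                LOG.addHandler(current);
            }

            public static void main(String[] args) throws IOException {
                rotate();  // open the first log file
                ScheduledExecutorService ses = Executors.newSingleThreadScheduledExecutor();
                ses.scheduleAtFixedRate(() -> {
                    try { rotate(); } catch (IOException e) { e.printStackTrace(); }
                }, 1, 1, TimeUnit.DAYS);
                LOG.info("importer started");
            }
        }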

  • HDFS: some datanodes of the cluster are suddenly disconnected while reducers are running

    - by user1429825
    I have 8 slave computers and 1 master computer running Hadoop (version 0.21). Some datanodes of the cluster are suddenly disconnected while I run MapReduce code on 10 GB of data. After all mappers finish and around 80% of the reducers have been processed, one or more datanodes randomly disconnect from the network, and then the other datanodes start to disappear from the network as well, even if I kill the MapReduce job as soon as I notice a datanode has disconnected. I've tried changing dfs.datanode.max.xcievers to 4096, turning off the firewalls of all compute nodes, disabling SELinux, and increasing the open-file limit to 20000, but none of it worked at all... Does anyone have an idea how to solve this problem? The following is the error log from MapReduce:

        12/06/01 12:31:29 INFO mapreduce.Job: Task Id : attempt_201206011227_0001_r_000006_0, Status : FAILED
        java.io.IOException: Bad connect ack with firstBadLink as ***.***.***.148:20010
            at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:889)
            at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:820)
            at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:427)

    And the following are logs from a datanode:

        2012-06-01 13:01:01,118 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_-5549263231281364844_3453 src: /*.*.*.147:56205 dest: /*.*.*.142:20010
        2012-06-01 13:01:01,136 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(*.*.*.142:20010, storageID=DS-1534489105-*.*.*.142-20010-1337757934836, infoPort=20075, ipcPort=20020) Starting thread to transfer block blk_-3849519151985279385_5906 to *.*.*.147:20010
        2012-06-01 13:01:19,135 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(*.*.*.142:20010, storageID=DS-1534489105-*.*.*.142-20010-1337757934836, infoPort=20075, ipcPort=20020):Failed to transfer blk_-5797481564121417802_3453 to *.*.*.146:20010 got java.net.ConnectException: Connection timed out
            at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
            at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
            at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
            at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:373)
            at org.apache.hadoop.hdfs.server.datanode.DataNode$DataTransfer.run(DataNode.java:1257)
            at java.lang.Thread.run(Thread.java:722)
        2012-06-01 13:06:20,342 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_6674438989226364081_3453
        2012-06-01 13:09:01,781 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(*.*.*.142:20010, storageID=DS-1534489105-*.*.*.142-20010-1337757934836, infoPort=20075, ipcPort=20020):Failed to transfer blk_-3849519151985279385_5906 to *.*.*.147:20010 got java.net.SocketTimeoutException: 480000 millis timeout while waiting for channel to be ready for write. ch : java.nio.channels.SocketChannel[connected local=/*.*.*.142:60057 remote=/*.*.*.147:20010]
            at org.apache.hadoop.net.SocketIOWithTimeout.waitForIO(SocketIOWithTimeout.java:246)
            at org.apache.hadoop.net.SocketOutputStream.waitForWritable(SocketOutputStream.java:164)
            at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:203)
            at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendChunks(BlockSender.java:388)
            at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:476)
            at org.apache.hadoop.hdfs.server.datanode.DataNode$DataTransfer.run(DataNode.java:1284)
            at java.lang.Thread.run(Thread.java:722)

    hdfs-site.xml:

        <configuration>
          <property>
            <name>dfs.name.dir</name>
            <value>/home/hadoop/data/name</value>
          </property>
          <property>
            <name>dfs.data.dir</name>
            <value>/home/hadoop/data/hdfs1,/home/hadoop/data/hdfs2,/home/hadoop/data/hdfs3,/home/hadoop/data/hdfs4,/home/hadoop/data/hdfs5</value>
          </property>
          <property>
            <name>dfs.replication</name>
            <value>3</value>
          </property>
          <property>
            <name>dfs.datanode.max.xcievers</name>
            <value>4096</value>
          </property>
          <property>
            <name>dfs.http.address</name>
            <value>0.0.0.0:20070</value>
            <description>50070 The address and the base port where the dfs namenode web ui will listen on. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.http.address</name>
            <value>0.0.0.0:20075</value>
            <description>50075 The datanode http server address and port. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.secondary.http.address</name>
            <value>0.0.0.0:20090</value>
            <description>50090 The secondary namenode http server address and port. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.address</name>
            <value>0.0.0.0:20010</value>
            <description>50010 The address where the datanode server will listen to. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.ipc.address</name>
            <value>0.0.0.0:20020</value>
            <description>50020 The datanode ipc server address and port. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>dfs.datanode.https.address</name>
            <value>0.0.0.0:20475</value>
          </property>
          <property>
            <name>dfs.https.address</name>
            <value>0.0.0.0:20470</value>
          </property>
        </configuration>

    mapred-site.xml:

        <configuration>
          <property>
            <name>mapred.job.tracker</name>
            <value>masternode:29001</value>
          </property>
          <property>
            <name>mapred.system.dir</name>
            <value>/home/hadoop/data/mapreduce/system</value>
          </property>
          <property>
            <name>mapred.local.dir</name>
            <value>/home/hadoop/data/mapreduce/local</value>
          </property>
          <property>
            <name>mapred.map.tasks</name>
            <value>32</value>
            <description>default number of map tasks per job.</description>
          </property>
          <property>
            <name>mapred.tasktracker.map.tasks.maximum</name>
            <value>4</value>
          </property>
          <property>
            <name>mapred.reduce.tasks</name>
            <value>8</value>
            <description>default number of reduce tasks per job.</description>
          </property>
          <property>
            <name>mapred.map.child.java.opts</name>
            <value>-Xmx2048M</value>
          </property>
          <property>
            <name>io.sort.mb</name>
            <value>500</value>
          </property>
          <property>
            <name>mapred.task.timeout</name>
            <value>1800000</value> <!-- 30 minutes -->
          </property>
          <property>
            <name>mapred.job.tracker.http.address</name>
            <value>0.0.0.0:20030</value>
            <description>50030 The job tracker http server address and port the server will listen on. If the port is 0 then the server will start on a free port.</description>
          </property>
          <property>
            <name>mapred.task.tracker.http.address</name>
            <value>0.0.0.0:20060</value>
            <description>50060</description>
          </property>
        </configuration>

  • Using Python, what's the best way to create a set of files on disk for testing?

    - by Chris R
    I'm looking for a way to create a tree of test files to unit test a packaging tool. Basically, I want to create some common file system structures -- directories, nested directories, symlinks within the selected tree, symlinks outside the tree, &c. Ideally I want to do this with as little boilerplate as possible. Of course, I could hand-write the set of files I want to see, but I'm thinking that somebody has to have automated this for a test suite somewhere. Any suggestions?
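
    One low-boilerplate pattern (not an existing library, just a sketch; the spec format is invented here) is to describe the tree as a nested dict and materialize it under a temp directory:

        import os
        import tempfile

        def make_tree(spec, root=None):
            """Materialize a nested-dict spec on disk and return the root path.

            A dict value is a subdirectory, a string is file content, and a
            ('symlink', target) tuple becomes a symlink.
            """
            root = root or tempfile.mkdtemp()
            for name, value in spec.items():
                path = os.path.join(root, name)
                if isinstance(value, dict):
                    os.mkdir(path)
                    make_tree(value, path)
                elif isinstance(value, tuple) and value[0] == 'symlink':
                    os.symlink(value[1], path)
                else:
                    with open(path, 'w') as f:
                        f.write(value)
            return root

        # a nested dir, a file, a symlink inside the tree, and one pointing outside
        root = make_tree({
            'pkg': {'__init__.py': '', 'data.txt': 'hello'},
            'link_in': ('symlink', 'pkg/data.txt'),
            'link_out': ('symlink', '/etc/hostname'),
        })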

  • modifying a cloned element and reinserting in dom - jquery

    - by Praveen Prasad
        // dom
        <div id='toBeCloned'><span>Some name</span></div>
        <div id='targetElm'></div>

        // js
        $(function () {
            // creating a clone
            var _clone = $('#toBeCloned').clone(true);
            // target element
            var _target = $('#targetElm');
            // now the target element is to be filled with cloned elements, with the span's data changed
            var _someData = [1, 2, 3, 4];
            // loop through data
            $.each(_someData, function (i, data) {
                var _newElm = {};
                $.extend(_newElm, _clone);        // copy clone to new element
                _newElm.find('span').html(data);  // edit content of span
                alert('p');  // alert added to show that the append on the next line, instead of adding a new element to the DOM, just updates the previous one
                _target.append(_newElm);          // update target
            });
        });

    Expected result: 1 2 3 4. The result I am getting is 4.
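
    For what it's worth, $.extend copies references, so every _newElm here still wraps the same cloned DOM node, and append() then just moves that single node. A likely fix (a sketch) is to make a fresh clone inside the loop:

        $.each(_someData, function (i, data) {
            var _newElm = _clone.clone(true);  // a fresh node per iteration
            _newElm.find('span').html(data);
            _target.append(_newElm);
        });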

  • unable to load library at runtime in android application

    - by Addy
    Hi. I am working on an Android application in which I use JNI for native C code. I built this application against Android 2.0 with NDK r3, and it works fine. Now that I have changed to Android SDK version 1.5 (API version 3), I face a problem: the app is unable to open the library libtest_demo.so.

        05-13 16:54:23.603: INFO/dalvikvm(1211): Unable to dlopen(/data/data/org.abc.test_demo/lib/libtest_demo.so): Cannot find library

    I put the libtest_demo.so file in the same place, /data/data/org.abc.test_demo/lib/libtest_demo.so, but the problem still arises. Please help me.
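
    For reference, the library named in that log would normally be pulled in with a static System.loadLibrary call; a sketch (class and method names are made up):

        public class TestDemo {
            static {
                // Dalvik maps "test_demo" to libtest_demo.so in the app's lib directory
                System.loadLibrary("test_demo");
            }

            public static native int nativeCall();
        }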

  • How to convert records including 'include' associations to JSON.

    - by 99miles
    If I do something like:

        result = Appointment.find(:all, :include => :staff)
        logger.debug { result.inspect }

    then it only prints out the Appointment data, and not the associated staff data. If I do result[0].staff.inspect then I get the staff data, of course. The problem is I want to return this to AJAX as JSON, including the staff rows. How do I force it to include the staff rows, or do I have to loop through and create something manually?
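
    In Rails of that era, serialization takes its own :include option, separate from the finder's eager-loading one; a sketch:

        appointments = Appointment.find(:all, :include => :staff)
        render :json => appointments.to_json(:include => :staff)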

  • Parsing the json and storing it in an array?

    - by Prateek Raj
    Hi everyone, I'm very new to these web-related problems, so please help. I'm working on Sencha, which proves to be very difficult when it comes to JSON parsing. I'm planning on retrieving the data into the HTML page and then loading it into my JS file. Here is the problem: I've already asked about it and got a reply, http://jsbin.com/uwuca5, but now, when I use the HTML source code locally on my system, or even through IIS, I couldn't parse the data. Here is the link to my JSON file: http://compliantbox.com/optionsedge/sample.php. I'm trying to use this link in my code and retrieve the data, but the data returned is null. Please help. Thank you.

  • Which to use, XMP or RDF?

    - by zotty
    What's the difference between RDF and XMP? From what I can tell, XMP is derived from RDF... so what does it offer that RDF doesn't?

    My particular situation is this: I've got some images which need tagging with details of how an experiment was performed and what sort of data analysis has been performed on the images. A colleague of mine is pushing for XMP, but he's thinking of the images as photos. They're not really; they're just bits of data. From what I've seen (mainly by opening images in Notepad++), the XMP data looks very similar to RDF, even so far as using RDF in the tag names (e.g. <rdf:Seq>). I'd like this data to be usable by other people who use similar instruments for similar experiments, so creating a mini standard (schema?) seems like the way to go. Apologies for the lack of fundamental understanding; I'm a doctor, not a programmer! If it makes any difference, the language of choice will be C#.

    Edit for more information: First off, thanks for the excellent replies; thinking of XMP as a vocabulary for RDF makes things a lot clearer. The sort of data I'll be storing won't be available in any of the predefined sets. It'll detail experimental setups, locations and results. I think using RDF is the way to go.
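
    To make the relationship concrete: an XMP packet is RDF/XML wrapped in an x:xmpmeta envelope, and a home-grown experiment vocabulary is just one more namespace inside it. A sketch (the exp: namespace and all of its property names are invented for illustration):

        <x:xmpmeta xmlns:x="adobe:ns:meta/">
          <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                   xmlns:exp="http://example.org/experiment/1.0/">
            <rdf:Description rdf:about=""
                             exp:instrument="microscope-2"
                             exp:location="lab 3">
              <exp:analysisSteps>
                <rdf:Seq>
                  <rdf:li>background subtraction</rdf:li>
                  <rdf:li>intensity normalisation</rdf:li>
                </rdf:Seq>
              </exp:analysisSteps>
            </rdf:Description>
          </rdf:RDF>
        </x:xmpmeta>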

  • Adding custom columns to Propel model?

    - by Hard-Boiled Wonderland
    At the moment I am using the query below:

        $claims = ClaimQuery::create('c')
            ->leftJoinUser()
            ->withColumn('CONCAT(User.Firstname, " ", User.Lastname)', 'name')
            ->withColumn('User.Email', 'email')
            ->filterByArray($conditions)
            ->paginate($page = $page, $maxPerPage = $top);

    However, I then want to add columns manually, so I thought this would simply work:

        foreach ($claims as &$claim) {
            $claim->actions = array('edit' => array(
                'url' => $this->get('router')->generate('hera_claims_edit'),
                'text' => 'Edit'
            ));
        }
        return array('claims' => $claims, 'count' => count($claims));

    However, when the data is returned, Propel or Symfony2 seems to strip the custom data when it gets converted to JSON, along with all of the superfluous model data. What is the correct way of manually adding data this way?

  • jquery change load get url

    - by john morris
    OK, so I have a PHP page whose output changes dynamically depending on a $_GET variable, ['file']. I have another page (index.php) with a jQuery script that uses the load() function to load the PHP page. I have a list of links, and when you click on one, it needs to change the $_GET variable to load, then refresh the loaded content. Here's a snippet:

        $("#remote-files").load("data.php?file=wat.txt");
        $(".link1").mousedown(function() {
            $("#remote-files").load("data.php?file=link1.txt");
        });

    As you can see, it loads the data into a div with the ID remote-files. Is there a better way to do this, like updating the page with the new GET variable instead of redefining a new load call for every link?
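
    A tidier variant is a single handler that reads the target file from the link itself; a sketch (the file-link class and data-file attribute are assumptions, not from the original page):

        $('#remote-files').load('data.php?file=wat.txt');
        $('.file-link').mousedown(function () {
            // e.g. <a class="file-link" data-file="link1.txt">link 1</a>
            var file = $(this).data('file');
            $('#remote-files').load('data.php?file=' + encodeURIComponent(file));
        });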

  • Error while trying to parse a website URL using Python. How to debug it?

    - by mekasperasky
        #!/usr/bin/python
        import json
        import urllib
        from BeautifulSoup import BeautifulSoup
        from BeautifulSoup import BeautifulStoneSoup
        import BeautifulSoup

        def showsome(searchfor):
            query = urllib.urlencode({'q': searchfor})
            url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' % query
            search_response = urllib.urlopen(url)
            search_results = search_response.read()
            results = json.loads(search_results)
            data = results['responseData']
            print 'Total results: %s' % data['cursor']['estimatedResultCount']
            hits = data['results']
            print 'Top %d hits:' % len(hits)
            for h in hits:
                print ' ', h['url']
                resp = urllib.urlopen(h['url'])
                res = resp.read()
                soup = BeautifulSoup(res)
                print soup.prettify()
            print 'For more results, see %s' % data['cursor']['moreResultsUrl']

        showsome('sachin')

    What is wrong in this code? Note that for all 4 links I get out of the search, I am feeding each back in to extract its contents, and then using BeautifulSoup to parse it. How should I go about it?

  • WCF Callback Contract Fiddler Debugging

    - by DavyMac23
    I'm trying to debug a WCF self-hosted service utilizing callbacks. Every half second I make a callback to an SL3 app to display the latest changes (yes, there are tons of changes every half second). There are 3 services: one has new data every half second, one has new data every second, and another has new data every hour. I've set all of my timeouts (Send, Receive, Open and Close) to 2 days, 23 hours and 23 minutes, but I still get timeout errors on the service side when I go to issue the callback. I'm using Fiddler, and I notice that my service that has new data every hour is still showing up every 10 seconds. Fiddler shows a body length of -1; then, 10 seconds later, it changes to 0 and the response is HTTP 200, but the overall elapsed time is 10 seconds. What's going on here?

  • Cluster analysis on two columns that contain name of person in R

    - by Alka Shah
    I am a beginner in R. I have to do cluster analysis on data that contains two columns with names of persons. I converted it to a data frame, but it is of character type, and to use the dist() function the data frame must be numeric. An example of my data:

        #       Interviewed.Type  interviewed.Relation.Type
        1.      An1               Xuan
        2.      An2               The
        3.      An3               Ngoc
        4.      Bui               Thi
        5.      ANT               feed
        7.      Bach              Thi
        8.      Gian1             Thi
        9.      Lan5              Thi
        ...
        1100.   Xung              Van

    I will be grateful for your help.
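
    Since dist() wants numeric input, one standard route for categorical columns like these is a Gower dissimilarity via the cluster package (whether person names are meaningful as categories is a separate modelling question). A sketch, assuming the data frame is called df:

        # cluster::daisy() computes dissimilarities directly on factor columns
        library(cluster)
        df[] <- lapply(df, factor)         # treat each name column as categorical
        d <- daisy(df, metric = "gower")   # pairwise dissimilarity matrix
        hc <- hclust(d)                    # hierarchical clustering on it
        plot(hc)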

  • Store a byte[] stored in a SQL XML parameter to a varbinary(MAX) field in SQL Server 2005. Can it be done?

    - by Mikey John
    Store a byte[] stored in a SQL XML parameter to a varbinary(MAX) field in SQL Server 2005. Can it be done? Here's my stored procedure:

        SET ANSI_NULLS ON
        SET QUOTED_IDENTIFIER ON
        GO
        ALTER PROCEDURE [dbo].[AddPerson]
            @Data AS XML
        AS
            INSERT INTO Persons (name, image_binary)
            SELECT rowVals.value('./@Name', 'varchar(64)') AS [Name],
                   rowVals.value('./@ImageBinary', 'varbinary(MAX)') AS [ImageBinary]
            FROM @Data.nodes('/Data/Names') AS b(rowVals)

            SELECT SCOPE_IDENTITY() AS Id

    In my schema, Name is of type String and ImageBinary is of type byte[].
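
    If the byte[] arrives in the XML as base64 text (which is how .NET typically serializes binary data), one pattern that may work is decoding it with the xs:base64Binary() XQuery constructor; a sketch, untested against this exact schema:

        INSERT INTO Persons (name, image_binary)
        SELECT rowVals.value('./@Name', 'varchar(64)') AS [Name],
               rowVals.value('xs:base64Binary(./@ImageBinary)', 'varbinary(MAX)') AS [ImageBinary]
        FROM @Data.nodes('/Data/Names') AS b(rowVals)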

  • python: send a list/dict over network

    - by facha
    Hi, everyone. I'm looking for an easy way of packing/unpacking data structures for sending over the network. On the client, just before sending:

        a = ((1, 2), (11, 22), (111, 222))
        message = pack(a)

    and then on the server:

        a = unpack(message)

    Is there a library that could do the pack/unpack magic? Thanks in advance.
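
    The standard library already does this; a sketch with json (pickle handles more Python types, but only use it when both ends are trusted):

        import json

        a = ((1, 2), (11, 22), (111, 222))

        message = json.dumps(a)   # pack: structure -> text for the wire
        b = json.loads(message)   # unpack on the receiving side

        # caveat: JSON has no tuple type, so b is [[1, 2], [11, 22], [111, 222]]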

  • Calling Web Services Asynchronously in Page_Load Event

    - by Umar Siddique
    I'm working on a web application using VB.NET. In the Page_Load event I am calling a remote web service which takes time to bring back the data. During this process, none of the other content on the page is rendered. I want to call this remote web service asynchronously, so that the other data on the page is displayed first and the web service data is displayed when it becomes available.

  • django handling file uploads - target different than media folder

    - by Tom Tom
    Hi, I want to enable the user to upload media which will not be saved in the media folder. When I use the following line of code, data is uploaded to media/upload/logo:

        logo_img = models.FileField(upload_to='upload/logo', blank=True)

    I'm wondering how I can change this behaviour. I would try to write a custom FileField and a view that serves the data based on the database entries. I do not want to place the data the user uploads in the media folder, since it is not public data. Is this approach correct? Are there existing solutions that do exactly what I want, so that I would be reinventing the wheel by implementing this myself? Would appreciate any help!
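
    For what it's worth, a custom field may not be needed: FileField accepts a storage argument, so uploads can land in any private directory outside MEDIA_ROOT. A sketch (the path is illustrative; the files would then be served by a permission-checking view):

        from django.core.files.storage import FileSystemStorage
        from django.db import models

        # Files stored outside MEDIA_ROOT are never served as public media.
        private_storage = FileSystemStorage(location='/srv/private_uploads')

        class Profile(models.Model):
            logo_img = models.FileField(storage=private_storage,
                                        upload_to='upload/logo', blank=True)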

  • graph algorithms on GPU

    - by scatman
    Current GPU threads are somewhat limited (memory limits, limits on data structures, no recursion...). Do you think it would be feasible to implement a graph theory problem on a GPU? For example, vertex cover? Dominating set? Independent set? Max clique? ... Is it also feasible to have branch-and-bound algorithms on GPUs? Recursive backtracking?

  • iPhone Accelerometer > csv > email

    - by Bradley Powers
    Hi all, I'm trying to collect data for a machine learning project I'm working on. What I'd like to do is collect accelerometer data from an iPhone, save it to a CSV and email it to myself. My app is currently able to acquire data from the accelerometer, but I'm at a bit of a loss as to how to proceed. First of all, I'd like to acquire data for a preset amount of time (after playing a sound to the user), which I don't really know how to do and can't find good documentation for. Also, I'd like to save that data to a CSV, for which there is some documentation (specifically the NSString writeToFile method). Any recommendations or ideas? Thanks!
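
    A sketch of the timed-collection part, using the UIAccelerometer API of that era; the 50 Hz rate, the 10-second window and the samples ivar are assumptions:

        // In a view controller that adopts UIAccelerometerDelegate;
        // `samples` is an NSMutableString instance variable.
        - (void)startCollecting {
            samples = [[NSMutableString alloc] initWithString:@"x,y,z\n"];
            // play your start sound here (e.g. via AudioToolbox)
            UIAccelerometer *acc = [UIAccelerometer sharedAccelerometer];
            acc.updateInterval = 1.0 / 50.0;
            acc.delegate = self;
            [self performSelector:@selector(stopCollecting)
                       withObject:nil
                       afterDelay:10.0];  // the preset collection window
        }

        - (void)accelerometer:(UIAccelerometer *)accelerometer
                didAccelerate:(UIAcceleration *)a {
            [samples appendFormat:@"%f,%f,%f\n", a.x, a.y, a.z];
        }

        - (void)stopCollecting {
            [UIAccelerometer sharedAccelerometer].delegate = nil;
            NSString *path = [NSTemporaryDirectory()
                stringByAppendingPathComponent:@"accel.csv"];
            [samples writeToFile:path
                      atomically:YES
                        encoding:NSUTF8StringEncoding
                           error:NULL];
            // hand `path` to an MFMailComposeViewController attachment to email it
        }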

  • wpf legacy server call

    - by Shah Al
    Hi, we have a legacy application running Tomcat that publishes data in a simple HTML table. I have no control over the remote server publishing the data. I am looking to extract the data into a WPF desktop application and display it as a table. Is there any way a WPF application can make a URL call, get the result and parse the data? This would be similar to AJAX from JSP. Any thoughts or ideas? Please advise. Regards,
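
    On the fetch side this is only a couple of lines; a sketch (the URL is illustrative, and an HTML parser such as Html Agility Pack would handle the table extraction):

        using System.Net;

        // e.g. inside a Window's Loaded handler:
        using (var client = new WebClient())
        {
            string html = client.DownloadString("http://legacy-server:8080/report.html");
            // parse `html` for the table rows (e.g. with Html Agility Pack),
            // then bind the extracted rows to a WPF DataGrid
        }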

  • Uploading image to Flickr in C++

    - by Alien01
    I am creating an application in VC++ using Win32 and WinInet to upload an image to Flickr. I am able to get the frob and token correctly, but when I try to upload the image I get the error "Post size too large". The headers are created as follows:

        wstring wstrAddHeaders = L"Content-Type: multipart/form-data;boundary=ABCD\r\n";
        wstrAddHeaders += L"Host: api.flickr.com\r\n";
        wchar_t tempStr[MAX_PATH];
        wsprintf(tempStr, L"Content-Length: %ld\r\n", szTotalSize);
        wstrAddHeaders += tempStr;
        wstrAddHeaders += L"\r\n";

        HINTERNET hSession = InternetConnect(hInternet, L"www.flickr.com",
            INTERNET_DEFAULT_HTTP_PORT, NULL, NULL, INTERNET_SERVICE_HTTP, 0, 0);
        if (hSession == NULL)
        {
            dwErr = GetLastError();
            return;
        }

    The contents of the POST request are created as follows:

        wstring wstrBoundry = L"--ABCD\r\n";
        wstring wstrContent = wstrBoundry;
        wstrContent += L"Content-Disposition: form-data; name=\"api_key\"\r\n\r\n";
        wstrContent += wstrAPIKey.c_str();
        wstrContent += L"\r\n";
        wstrContent += wstrBoundry;
        wstrContent += L"Content-Disposition: form-data; name=\"auth_token\"\r\n\r\n";
        wstrContent += m_wstrToken.c_str();
        wstrContent += L"\r\n";
        wstrContent += wstrBoundry;
        wstrContent += L"Content-Disposition: form-data; name=\"api_sig\"\r\n\r\n";
        wstrContent += wstrSig;
        wstrContent += L"\r\n";
        wstrContent += wstrBoundry;
        wstrContent += L"Content-Disposition: form-data; name=\"photo\"; filename=\"C:\\test.jpg\"";
        wstrContent += L"\r\n";
        wstrContent += L"Content-Type: image/jpeg\r\n\r\n";

        wstring wstrFilePath(L"C:\\test.jpg");
        CAtlFile file;
        HRESULT hr = S_OK;
        hr = file.Create(wstrFilePath.c_str(), GENERIC_READ, FILE_SHARE_READ, OPEN_EXISTING);
        if (FAILED(hr))
        {
            return;
        }
        ULONGLONG nLen;
        hr = file.GetSize(nLen);
        if (nLen > (DWORD)-1)
        {
            return;
        }
        char *fileBuf = new char[nLen];
        file.Read(fileBuf, nLen);

        wstring wstrLastLine(L"\r\n--ABCD--\r\n");
        size_t szTotalSize = sizeof(wchar_t) * (wstrContent.length())
                           + sizeof(wchar_t) * (wstrLastLine.length()) + nLen;
        unsigned char *buffer = (unsigned char *)malloc(szTotalSize);
        memset(buffer, 0, szTotalSize);
        memcpy(buffer, wstrContent.c_str(), wstrContent.length() * sizeof(wchar_t));
        memcpy(buffer + wstrContent.length() * sizeof(wchar_t), fileBuf, nLen);
        memcpy(buffer + wstrContent.length() * sizeof(wchar_t) + nLen,
               wstrLastLine.c_str(), wstrLastLine.length() * sizeof(wchar_t));

        hRequest = HttpOpenRequest(hSession, L"POST", L"/services/upload/", L"HTTP/1.1",
                                   NULL, NULL, 0, NULL);
        if (hRequest)
        {
            bRet = HttpAddRequestHeaders(hRequest, wstrAddHeaders.c_str(),
                                         wstrAddHeaders.length(),
                                         HTTP_ADDREQ_FLAG_ADD | HTTP_ADDREQ_FLAG_REPLACE);
            if (bRet)
            {
                bRet = HttpSendRequest(hRequest, NULL, 0, (void *)buffer, szTotalSize);
                if (bRet)
                {
                    while (true)
                    {
                        char buffer[1024] = {0};
                        DWORD read = 0;
                        BOOL r = InternetReadFile(hRequest, buffer, 1024, &read);
                        if (read != 0)
                        {
                            wstring strUploadXML = buffer;
                            break;
                        }
                    }
                }
            }
        }

    I am not sure about the way I am adding the image data to the string and posting the request. Do I need to convert the image data to Unicode? Any suggestions? If someone can find what I am doing wrong, that would be very helpful to me.

  • How do I combine these similar linq queries into one?

    - by MikeD
    This is probably a basic LINQ question. I need to select one object, and if it is null, select another. I'm using LINQ to Objects in the following way, which I know can be done quicker, better, cleaner...

        public Attrib DetermineAttribution(Data data)
        {
            var one = from c in data.actions
                      where c.actionType == Action.ActionTypeOne
                      select new Attrib { id = c.id, name = c.name };
            if (one.Count() > 0)
                return one.First();

            var two = from c in data.actions
                      where c.actionType == Action.ActionTypeTwo
                      select new Attrib { id = c.id, name = c.name };
            if (two.Count() > 0)
                return two.First();
        }

    The two LINQ operations differ only in the where clause, and I know there is a way to combine them. Any thoughts would be appreciated.
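
    One way to collapse the two queries into a single pass (a sketch): match both types, order so that type one wins, and take the first hit.

        public Attrib DetermineAttribution(Data data)
        {
            // Prefer ActionTypeOne, fall back to ActionTypeTwo, null if neither exists.
            return (from c in data.actions
                    where c.actionType == Action.ActionTypeOne
                       || c.actionType == Action.ActionTypeTwo
                    orderby c.actionType == Action.ActionTypeOne ? 0 : 1
                    select new Attrib { id = c.id, name = c.name })
                   .FirstOrDefault();
        }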

  • Atomic swap in GNU C++

    - by Steve
    I want to verify that my understanding is correct. This kind of thing is tricky, so I'm almost sure I am missing something. I have a program consisting of a real-time thread and a non-real-time thread. I want the non-RT thread to be able to swap a pointer to memory that is used by the RT thread. From the docs, my understanding is that this can be accomplished in g++ with:

        // global
        Data *rt_data;

        Data *swap_data(Data *new_data)
        {
        #ifdef __GNUC__
            // Atomic pointer swap.
            Data *old_d = __sync_lock_test_and_set(&rt_data, new_data);
        #else
            // Non-atomic, cross your fingers.
            Data *old_d = rt_data;
            rt_data = new_data;
        #endif
            return old_d;
        }

    This is the only place in the program (other than initial setup) where rt_data is modified. When rt_data is used in the real-time context, it is copied to a local pointer. For old_d, later on, when it is certain the old memory is not in use, it will be freed in the non-RT thread. Is this correct? Do I need volatile anywhere? Are there other synchronization primitives I should be calling? By the way, I am doing this in C++, although I'm interested in whether the answer differs for C. Thanks ahead of time.
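
    As a point of comparison, C++11 later made this intent explicit with std::atomic (C11 has the matching atomic_exchange); a sketch of the equivalent swap:

        #include <atomic>

        struct Data;

        std::atomic<Data*> rt_data{nullptr};  // global, shared by both threads

        Data *swap_data(Data *new_data)
        {
            // Sequentially consistent exchange: the RT thread reading rt_data
            // sees either the old pointer or the new one, never a torn value.
            return rt_data.exchange(new_data);
        }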
