Search Results

Search found 64715 results on 2589 pages for 'spring data rest'.

Page 279/2589 | < Previous Page | 275 276 277 278 279 280 281 282 283 284 285 286  | Next Page >

  • large test data for knapsack problem

    - by user347918
    I am a research student looking for large test data for the knapsack problem so that I can benchmark my algorithm, but I haven't been able to find any large instances. I need data with around 1000 items; the capacity doesn't matter. The larger the instance, the better it is for my algorithm. Is there any large data set available on the internet? Does anybody know of one? I need it urgently.
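
    If no published instances turn up, one common workaround is to generate random instances yourself. Below is a minimal sketch in Java that writes a 1000-item instance with uniformly random weights and values; the value range and the capacity rule (half the total weight) are assumptions for illustration, not part of any standard benchmark.

        import java.util.Random;

        public class KnapsackInstanceGenerator {
            public static void main(String[] args) {
                int items = 1000;                 // instance size the question asks for
                Random rnd = new Random(42);      // fixed seed so the instance is reproducible
                int[] weights = new int[items];
                int[] values = new int[items];
                long totalWeight = 0;
                for (int i = 0; i < items; i++) {
                    weights[i] = 1 + rnd.nextInt(1000);   // assumed range [1, 1000]
                    values[i] = 1 + rnd.nextInt(1000);
                    totalWeight += weights[i];
                }
                long capacity = totalWeight / 2;  // assumed convention: capacity = half the total weight
                System.out.println(items + " " + capacity);
                for (int i = 0; i < items; i++) {
                    System.out.println(values[i] + " " + weights[i]);
                }
            }
        }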

    Read the article

  • PageableListView Not rendering my data as required

    - by Robin
    I am working with Wicket, where I need to show my data under the following markup:

        <tr>
          <td>Name</td>
          <td>Single Player Score</td>
          <td>Double Player Score</td>
          <td>Total Score</td>
        </tr>
        <tr wicket:id="data">
          <td wicket:id="name"></td>
          <td wicket:id="singlePlayerScore"></td>
          <td wicket:id="doublePlayerScore"></td>
          <td wicket:id="totalScore"></td>
        </tr>

    My Player model class has the attributes name, singlePlayerScore and doublePlayerScore, with getters and setters, plus a list of data obtained from the database. The data from the SQL query is:

        name  score  gamemode
        A     200    singlePlayerMode
        A     100    doublePlayerMode
        B     400    singlePlayerMode
        B     300    doublePlayerMode

        dataList = player.getScoreList();

    My PageableListView is:

        final PageableListView listView = new PageableListView("data", dataList, 10) {
            @Override
            protected void populateItem(Item item) {
                player = (Player) item.getModelObject();
                item.add(Label("name", player.getName()));
                item.add(Label("singlePlayerScore", player.getName()));
                item.add(Label("doublePlayerScore", player.getName()));
                item.add(Label("totalScore", String.valueOf(player.getSinglePlayerScore() + player.getDoublePlayerScore())));
            }
        }

    The view I get is:

        Name  Single Player Score  Double Player Score  Total Score
        A     0                    100                  100
        A     200                  0                    200
        B     0                    300                  300
        B     400                  0                    400

    How do I achieve the view below on my web page?

        Name  Single Player Score  Double Player Score  Total Score
        A     200                  100                  300
        B     400                  300                  700

    Why is this happening? I suspect it is because my list has four elements, which is why four rows are rendered. What can I do to get the required view?
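
    One likely cause, going by the rendered output, is that the query returns one row per player per game mode, so dataList holds four partially filled Player objects. A minimal sketch of collapsing those rows into one Player per name before handing the list to the PageableListView is below; the mergeByName helper is hypothetical, and it assumes Player has a no-argument constructor, the getters and setters described above, and that each raw row carries only one of the two scores.

        import java.util.ArrayList;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        // Collapse the per-mode rows into one Player per name, so the ListView
        // gets two merged entries instead of four partial ones.
        static List<Player> mergeByName(List<Player> rows) {
            Map<String, Player> merged = new LinkedHashMap<String, Player>();
            for (Player row : rows) {
                Player p = merged.get(row.getName());
                if (p == null) {
                    p = new Player();
                    p.setName(row.getName());
                    merged.put(row.getName(), p);
                }
                // each raw row holds only one of the two scores; keep the non-zero one
                if (row.getSinglePlayerScore() != 0) p.setSinglePlayerScore(row.getSinglePlayerScore());
                if (row.getDoublePlayerScore() != 0) p.setDoublePlayerScore(row.getDoublePlayerScore());
            }
            return new ArrayList<Player>(merged.values());
        }

    Passing mergeByName(dataList) (size 2) to the PageableListView instead of dataList (size 4) should then render one row per player.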

    Read the article

  • Decide which caching strategy to use?

    - by hib
    Hi all, I want to cache my loaded data so that I can reduce my application's start time. I know several strategies for storing application data: Core Data, NSUserDefaults, and archiving. My scenario is that I have an array of at most 10 objects, each object having 5 fields. I can't decide which strategy to use for storing this array and later retrieving it. Thanks.

    Read the article

  • Why is hibernate returning a proxy object?

    - by predhme
    I have a service method that returns an object from the database. This method is called from numerous parts of the system. However, one particular caller gets back an object whose type is ObjectClass_$$_javassist_somenumber, which is throwing things off. I call the service method exactly the same way as everywhere else, so why would Hibernate return the proxy rather than the actual object? I know there are ways to unwrap the "proxied" object, but I don't feel like I should have to do that. The query is simply: hibernateTemplate.find("from User u where u.username = ?", username)
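
    The proxy usually shows up when the entity has already been referenced lazily in the same session (for example via session.load or a lazy association), so Hibernate hands back the existing proxy instead of a plain instance. If unwrapping really does turn out to be necessary, a minimal sketch using Hibernate's HibernateProxy API, applied to the find(...) call from the question, could look like this:

        import java.util.List;
        import org.hibernate.proxy.HibernateProxy;

        // Unwrap the entity if Hibernate handed back a lazy proxy instead of the plain object.
        List<?> results = hibernateTemplate.find("from User u where u.username = ?", username);
        Object first = results.isEmpty() ? null : results.get(0);
        if (first instanceof HibernateProxy) {
            first = ((HibernateProxy) first).getHibernateLazyInitializer().getImplementation();
        }
        User user = (User) first;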

    Read the article

  • do.call(rbind, list) for an uneven number of columns

    - by h.l.m
    I have a list in which each element is a named character vector, and the vectors have differing lengths. I would like to bind the data as rows so that the column names 'line up', creating a new column where there is extra data and filling with NAs where data is missing. Below is a mock example of the data I am working with:

        x <- list()
        x[[1]] <- letters[seq(2, 20, by = 2)]
        names(x[[1]]) <- LETTERS[c(1:length(x[[1]]))]
        x[[2]] <- letters[seq(3, 20, by = 3)]
        names(x[[2]]) <- LETTERS[seq(3, 20, by = 3)]
        x[[3]] <- letters[seq(4, 20, by = 4)]
        names(x[[3]]) <- LETTERS[seq(4, 20, by = 4)]

    The line below is what I would normally do if I were sure that the format of each element was the same:

        do.call(rbind, x)

    I was hoping someone had a nice little solution that matches up the column names, fills in the blanks with NAs, and adds new columns if new column names turn up during the binding.

    Read the article

  • Pass data in np.ndarray to Highcharts

    - by F.N.B
    I'm working with Python 2.7, Jinja2, Flask and Highcharts. I create two numpy arrays (x1 and x2, type numpy.ndarray) and pass them to Highcharts. My problem is that the arrays are rendered without commas between the values, so Highcharts can't parse them. This is my Jinja2 code:

        <script>
        $(function () {
            $('#container').highcharts({
                series: [{
                    name: 'Tokyo',
                    data: {{ x1 }}
                }, {
                    name: 'London',
                    data: {{ x2 }}
                }]
            });
        });

    And this is the error I see with the Chrome dev tools network tab:

        series: [{
            name: 'Tokyo',
            data: [1 4 5 2 3]
        }, {
            name: 'London',
            data: [3 6 7 4 1]
        }]

    Do I need to convert the numpy arrays to Python lists before passing them to Highcharts, or is there a better way to do this? Thanks

    Read the article

  • DATA REQUEST IN SMALLER CHUNKS?

    - by Googler
    Hi folks, I have developed a Windows service to retrieve all the response data provided by the EPO web services. After a few hours of scraping the data over the internet, I receive this error message: Error: Please request bibliographic data in smaller chunks. Here, bibliographic data is one of the services provided by the EPO web service. I believe there is no error in my inputs or in the service call. I don't know what this error means. Is it related to the internet connection or to my web service calls? Can anyone please help me understand what this error actually means?
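
    The message itself suggests the server is rejecting requests that ask for too many records at once, so the usual response is to split the identifier list into small batches and request them one batch at a time. A minimal sketch of that batching loop is below; the BiblioClient interface, fetchBibliographicData, and the batch size are hypothetical placeholders for whatever client call and limit apply here, not part of the EPO API.

        import java.util.ArrayList;
        import java.util.List;

        public class ChunkedRequester {

            // Hypothetical stand-in for the real EPO web service call.
            interface BiblioClient {
                String fetchBibliographicData(List<String> documentIds);
            }

            // Ask for the records in small batches instead of all at once.
            static List<String> fetchInChunks(BiblioClient client, List<String> documentIds, int batchSize) {
                List<String> responses = new ArrayList<String>();
                for (int start = 0; start < documentIds.size(); start += batchSize) {
                    int end = Math.min(start + batchSize, documentIds.size());
                    responses.add(client.fetchBibliographicData(documentIds.subList(start, end)));
                }
                return responses;
            }
        }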

    Read the article

  • Migrating Data to MSSQL 2008

    - by Fred Clown
    I am trying to migrate data from an Informix database to MSSQL 2008, and I've got quite a lot of data to move. I've been trying multiple methods to get the data over, and so far SqlBulkCopy in multiple chunks seems to be the fastest that I can find. Does anyone know of a faster means of getting the data over? I'm trying to cut down on the transfer time so that I don't run out of time to complete the full cut-over on my cut-over date. Thanks.

    Read the article

  • Correcting a messed-up file tree in an NTFS partition

    - by Fullmooninu
    It's a really messed-up situation, but I'm nearly out of options. It's my personal hard drive, so it's very important to me, and yes, I have no backup =( The short story:

    1) I have two discs. One with Windows, and another where I had a bit of empty space at the front of the disk so I could install Linux. The rest was occupied by a 1.8TB NTFS partition filled with data.
    2) I installed Linux, and after a while realized there was not enough space for everything, so I tried using GParted and told it to resize the NTFS partition to a smaller size.
    3) The system jammed. I had to reboot, which broke the resizing operation.

    Here's what I did to fix it:

    a) Rebooted into a Linux live system and used TestDisk to deep-analyze the disk and recover the possible partitions. It found several versions of the NTFS partition, probably created during the resizing. I told TestDisk to open every one of them, and only one could list its files. When trying to open the other candidates, TestDisk showed an error message. I assumed the one without errors was the correct one, so I told TestDisk to recover that partition and write a new MBR.
    b) The partition had errors, and Linux has an NTFS fixing tool; I used it, but the partition still had errors.
    c) So I booted into Windows and used chkdsk to correct all errors in the partition.
    d) Everything seems fine, but now, back in Windows, when I open one file, it opens another file, or part of another file. As in, some files took up the position of other files.

    What I think happened is that I recovered an old file tree, not the most current one, and the old one just happened to be intact while the most recent one was damaged. As a result, the files that were moved during the failed resizing were, during the automatic correction, wrongly assumed to be in their original places. So when I open a file, it actually opens another one: Radiohead - Creep.mp3 will open, and it will actually be a bit of another song, or even data from a jpg. Some files seem to be all right, but others seem to have had their positions taken by other files. Does anyone know of something really powerful that can help me solve this?

    Read the article

  • Cannot create a new VS data connection in Server Explorer

    - by Seventh Element
    I have a local instance of SQL Server 2008 Express Edition running on my development PC. I'm trying to create a new data connection through the Visual Studio Server Explorer. The steps are the following:

    1. Right-click the "Data Connections" node -> Choose Data Source.
    2. I select "Microsoft SQL Server" as the data source. The "Add Connection" dialog window appears.
    3. I select my local server instance -> "Test connection" works fine.
    4. I select "AdventureWorks" as the database name -> "Test connection" works fine.
    5. Next I hit the "Ok" button -> Error message: "This server version is not supported. Only servers up to MS SQL Server 2005 are supported."

    I'm using Visual Studio 2008 Professional Edition. The target framework of the application is ".NET Framework 3.5". I have a reference to System.Data (framework v2.0) and cannot find another version of the assembly on my system. Am I referencing the wrong assembly? How can I fix this problem?

    Read the article

  • Hibernate one to one with multiple columns

    - by Erdem Emekligil
    How can I bind two columns using the @OneToOne annotation? Let's say I have two tables, A and B.

        Table A: id1 (primary key), id2 (pk), other columns
        Table B: id1 (pk), id2 (pk), other columns

    In class A I want to write something like this:

        @OneToOne(fetch = FetchType.EAGER, targetEntity = B.class)
        @JoinColumn(name = "id1 and id2", referencedColumnName = "id1 and id2")
        private B b;

    Is it possible to do this using annotations? Thanks.
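
    JPA does allow a to-one association over a composite key by listing one @JoinColumn per column inside @JoinColumns. A minimal sketch, under the assumption that B maps its two-column primary key (for example with an @EmbeddedId), is:

        import javax.persistence.FetchType;
        import javax.persistence.JoinColumn;
        import javax.persistence.JoinColumns;
        import javax.persistence.OneToOne;

        // Inside entity A: join to B on both key columns at once.
        @OneToOne(fetch = FetchType.EAGER, targetEntity = B.class)
        @JoinColumns({
            @JoinColumn(name = "id1", referencedColumnName = "id1"),
            @JoinColumn(name = "id2", referencedColumnName = "id2")
        })
        private B b;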

    Read the article

  • Webservice for uploading data: security considerations

    - by Philip Daubmeier
    Hi everyone! I'm not sure which authentication method I should use for my web service. I've searched on SO and found nothing that helped me.

    Preliminary
    I'm building an application that uploads data from a local database to a server (running my web service), where all records are merged and stored in a central database. I am currently binary-serializing a DataTable that holds a small fragment of the local database, where all uninteresting stuff is already filtered out. The byte[] (the serialized DataTable), together with the user id and a hash of the user's password, is then uploaded to the web service via SOAP. The application and the web service already work exactly as intended.

    The Problem
    The issue I am thinking about is: what if someone just sniffs the network traffic, 'steals' the user's id and password hash, and sends his own SOAP message with modified data that corrupts my database?

    Options
    The approaches to solving that problem I have already thought of are:
    1. Using SSL + certificates for establishing the connection: I don't really want to use SSL; I would prefer a simpler solution. After all, every piece of information that is transferred to the web service can be seen on the website later on. What I want to say is: there is no secret/financial/business-critical information that has to be hidden. I think SSL would be sort of an overkill for this task.
    2. Encrypting the byte[]: I think that would be a performance killer, considering that the goal of the exercise was simply to authenticate the user.
    3. Hashing the user's password together with the data: I kind of like the idea: create a checksum from the data, concatenate that checksum with the password hash, and hash the whole thing again. That would ensure the data was sent by this specific user and that the data wasn't modified.

    The actual question
    So, what do you think is the best approach in terms of meeting the following requirements?
    - A rather simple solution (it doesn't have to be super secure; no secret or business-critical information is transferred)
    - Easily implementable retrospectively (I don't want to write it all again :) )
    - Doesn't impact performance too much
    What do you think of my preferred solution, the last one in the list above? Is there any alternative solution I didn't mention that would fit better? You don't have to answer every question in detail. Just push me in the right direction. I very much appreciate every well-grounded opinion. Thanks in advance!
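
    For what it's worth, the third option is essentially a message authentication code, and the standard way to compute one is an HMAC over the payload with a shared secret rather than a hand-rolled double hash. A minimal sketch using the JDK's javax.crypto classes is below; treating the stored password hash as the shared secret is an assumption carried over from the question, not a recommendation.

        import javax.crypto.Mac;
        import javax.crypto.spec.SecretKeySpec;

        public class PayloadSigner {

            // Compute an HMAC-SHA256 tag over the serialized payload using a shared secret.
            // The server recomputes the tag from the received bytes and rejects the request
            // if it does not match, covering both authentication and tamper detection.
            static byte[] sign(byte[] payload, byte[] sharedSecret) throws Exception {
                Mac mac = Mac.getInstance("HmacSHA256");
                mac.init(new SecretKeySpec(sharedSecret, "HmacSHA256"));
                return mac.doFinal(payload);
            }
        }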

    Read the article

  • Playback audio data with GWT

    - by Henrik
    I am creating a GWT client application which interacts with a server, and I am getting all my response data from the server in JSON format. Among other things, there is wave data in the server's database which I would like to retrieve and then play back on the client. I am able to get the wave data as an array of bytes in the JSON format. My problem is: how do I play back the wave array data in a browser? Is it even possible, or do I have to find another solution? I've searched the web and found some GWT packages which are able to play back sound, but they all play back directly from a URL.
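
    One option, if a reasonably modern browser can be assumed, is GWT's HTML5 Audio widget fed with the wave bytes as a data: URI. A minimal sketch is below; it assumes the server can return the wave data base64-encoded in the JSON (encoding a raw byte array on the client would be extra work), and it relies on com.google.gwt.media.client.Audio, which is only available in newer GWT releases.

        import com.google.gwt.media.client.Audio;

        // Play base64-encoded WAV data received from the server via a data: URI.
        void playWave(String base64Wav) {
            Audio audio = Audio.createIfSupported();   // null if the browser has no HTML5 audio support
            if (audio == null) {
                // fall back to another approach (e.g. a plugin-based player) here
                return;
            }
            audio.setSrc("data:audio/wav;base64," + base64Wav);
            audio.play();
        }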

    Read the article

  • send data to server without ajax

    - by PramodD
    Hi, I want to send data to the server with a POST request. I know it is a simple question, but I couldn't find good documentation for this. I have a method that sends data to the server, but it uses Ajax. I want to send the data without Ajax; how do I do that? Here is my current code:

        $.ajax({
            type: "POST",
            contentType: "application/x-www-form-urlencoded; charset=UTF-8",
            url: clientDetailURL,
            data: finalclientDetailParam
        }).done(function(msg1) {
            var clientDetailResponse = msg1;
            console.log("Client detail response is:" + clientDetailResponse);
        });

    Read the article

  • Pros and Cons of Access Data Project (MS Access front end with SQL Server Backend)

    - by webworm
    I have been tasked with moving an existing MS Access application (.mdb) over to an Access Data Project (.adp). Basically the Access forms will remain the same, but the data will be migrated over to SQL Server. I am not too familiar with Access Data Projects, so I was hoping I could get some opinions on the pros and cons of using them. My first thought was to convert this to a web application or even a WinForms application; however, I really wanted to perform due diligence in looking at Access Data Projects before making a decision. Thanks for any assistance.

    Read the article

  • recv sometimes not receiving all the data

    - by milo
    Hi all, I have the following issue. Here is the chunk of code:

        void get_all_buf(int sock, std::string & inStr) {
            int n = 1;
            char c;
            char temp[1024*1024];
            bzero(temp, sizeof(temp));
            n = recv(sock, temp, sizeof(temp), 0);
            inStr = temp;
        };

    Sometimes recv returns only part of the data, not all of it (the received length is always less than sizeof(temp)). The writing side always sends the whole data (I confirmed it with a sniffer). What is the problem? Thanks.
    P.S. I know good manners suggest I should check n (if (n < 0) perror("error while receiving data")), but that doesn't matter right now; it's not the reason for my problem.
    P.S.2 I forgot to mention: it's a blocking socket.

    Read the article

  • hibernate - uniqueResult silently fails

    - by robinmag
    I have a login controller that uses Hibernate's uniqueResult method. Everything works fine when I test it on Eclipse's Tomcat server, but when I deploy my webapp to a standalone Tomcat server (on the same machine) it fails: uniqueResult always returns null, even when I use the correct credentials. Here is my Hibernate code:

        session.createCriteria(User.class)
            .add(Restrictions.eq(User.USERNAME_FIELD, userName))
            .add(Restrictions.eq(User.PASSWORD_FIELD, password))
            .uniqueResult();

    Thank you!
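
    One thing worth checking first is whether the deployed instance is hitting the same database at all, and whether the criteria matches zero rows or more than one (uniqueResult returns null for zero matches and throws for several). A small diagnostic sketch, using the same criteria from the question but list() instead of uniqueResult(), might look like this:

        import java.util.List;
        import org.hibernate.criterion.Restrictions;

        // Diagnostic only: run the same criteria with list() to see how many rows
        // the deployed environment actually returns for these credentials.
        List<?> matches = session.createCriteria(User.class)
                .add(Restrictions.eq(User.USERNAME_FIELD, userName))
                .add(Restrictions.eq(User.PASSWORD_FIELD, password))
                .list();
        System.out.println("matching users: " + matches.size());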

    Read the article

  • Pulling $_GET data and creating multidimensional array using loop

    - by Chris J
    I'm creating a checkout for customers, and the data about what's in their cart is being sent to a page (just for now) via $_GET. I want to extract that data and then populate a multidimensional array with it using a loop. Here's how I'm naming the data:

        $itemCount = $_GET['itemCount'];
        $i = 1;
        while ($i <= $itemCount) {
            ${'item_name_'.$i} = $_GET["item_name_{$i}"];
            ${'item_quantity_'.$i} = $_GET["item_quantity_{$i}"];
            ${'item_price_'.$i} = $_GET["item_price_{$i}"];
            //echo "<br />Name: " .${'item_name_'.$i}. " - Quantity: " .${'item_quantity_'.$i}. " - Price: ".${'item_price_'.$i};
            $i++;
        }

    From here I'd like to create a multidimensional array like this:

        Array
        (
            [Item_1] => Array ( [item_name] => Shoe   [item_quantity] => 2 [item_price] => 40.00 )
            [Item_2] => Array ( [item_name] => Bag    [item_quantity] => 1 [item_price] => 60.00 )
            [Item_3] => Array ( [item_name] => Parrot [item_quantity] => 4 [item_price] => 90.00 )
            ...
        )

    What I'd like to know is whether there is a way to build this array inside the existing while loop. I'm aware that data can be added to an array after declaring an empty one with $data = [], but the actual syntax eludes me. Maybe I'm completely off the right track and there is a better way of doing it? Thanks

    Read the article

  • How do I reload a Roo project without clearing the database?

    - by Omniwombat
    I've been learning how to build projects using Roo and am making good progress. I have the nucleus of a project which correctly displays my defined entities and allows me to create, edit, and delete the corresponding objects. I am using MySQL as the database, and I see that objects entered using the UI correctly appear in the MySQL database. Per the Roo instructions, I am starting the webapp using "mvn tomcat:run". Unfortunately, I've discovered that whenever I restart Tomcat using Maven, it clears all of the existing objects out of the database, and I'm left with empty tables. It seems to do this as the final step just prior to Tomcat stating that the server has started. I know this is just me being a n00b, but searches haven't been very helpful, and none of the project's XML files seem relevant,

    Read the article

  • Error: Get JSON Data from Web API Using jQuery

    - by Kenneth
    I'm really new at this, and I'm really stuck. I have the jQuery code below; it loads data from the Web API, but it does not display on my page.

        $.getJSON("/api/Order", function(data) {
            if (data != null) {
                var str = '';
                $.each(data, function (item) {
                    str = '<li>' + item.ItemName + '</li>';
                });
                $("#contents").append(str);
            }
        });

    Can anyone explain what is going on? Thanks.

    Read the article

  • What data is sent to Google Analytics?

    - by Darryl Hein
    Has anyone found any documentation or research about what data is transferred to Google Analytics when it's added to a site? The main thing I'm wondering about is POST data, but the details of exactly what is sent would be useful. I'm considering implementing it on sites that have a lot of private data on them, and I'm wondering what data Google will capture, if any. (The sites are login-only.) I need proof that I can provide to the users.

    Read the article

  • Why is the Simulink Data Type Conversion block altering the data when it should be typecasting?

    - by Nick
    I am attempting to typecast some data from int32 to single. I first tried using the 'Data Type Conversion' block with a single output data type and the Stored Integer option. However, I found that the Data Type Conversion block is not typecasting the data the way I expect it to. Am I using the block incorrectly, or is it failing to work as it should?

        temp1 (pre-conversion):
            uint32: 1405695244
            single: 1728356810752.000000
            binary: 01010011110010010011010100001100

        temp2 (post-conversion):
            uint32: 1319604842
            single: 1405695232.000000
            binary: 01001110101001111001001001101010

    By the way, I have gotten around the issue by using an Embedded MATLAB block to perform the typecasting operation.
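
    The numbers in the question line up with the difference between a numeric conversion and a raw-bit reinterpretation: 1405695244 converted to single rounds to 1405695232.0, whereas the same 32 bits reinterpreted as a float read as about 1.72836e12. A small sketch of the distinction (in Java rather than MATLAB, purely as an illustration of the two operations):

        public class TypecastVsConvert {
            public static void main(String[] args) {
                int raw = 1405695244;  // the pre-conversion stored integer from the question

                // Reinterpret the same 32-bit pattern as an IEEE-754 float (a true typecast):
                float reinterpreted = Float.intBitsToFloat(raw);
                System.out.println(reinterpreted);   // ~1.72835681E12

                // Convert the numeric value to float (what the block appears to be doing):
                float converted = (float) raw;
                System.out.println(converted);       // 1.40569523E9, i.e. 1405695232.0
            }
        }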

    Read the article
