Search Results

Search found 67143 results on 2686 pages for 'complex data types'.

Page 408/2686

  • Large public datasets?

    - by Jason
    I am looking for some large public datasets, in particular:
    - Large sample web server logs that have been anonymized.
    - Datasets used for database performance benchmarking.
    Any other links to large public datasets would be appreciated. I already know about Amazon's public datasets at: http://aws.amazon.com/publicdatasets/

    Read the article

  • Curve fitting: Find the smoothest function that satisfies a list of constraints.

    - by dreeves
    Consider the set of non-decreasing surjective (onto) functions from (-inf,inf) to [0,1]. (Typical CDFs satisfy this property.) In particular, for any real number x, 0 <= f(x) <= 1. The logistic function is perhaps the most well-known example.
    We are now given some constraints in the form of a list of x-values and, for each x-value, a pair of y-values that the function must lie between. We can represent that as a list of {x, ymin, ymax} triples such as
        constraints = {{0, 0, 0}, {1, 0.00311936, 0.00416369}, {2, 0.0847077, 0.109064},
                       {3, 0.272142, 0.354692}, {4, 0.53198, 0.646113},
                       {5, 0.623413, 0.743102}, {6, 0.744714, 0.905966}}
    Graphically, that is a set of vertical error bars (plot omitted). We now seek a curve that respects those constraints; one candidate fit was plotted in the question (plot omitted). Let's first try a simple interpolation through the midpoints of the constraints:
        mids = ({#1, Mean[{#2, #3}]}&) @@@ constraints
        f = Interpolation[mids, InterpolationOrder->0]
    Plotted, f is a step function through the midpoints (plot omitted). That function is not surjective. Also, we'd like it to be smoother. We can increase the interpolation order, but then it violates the constraint that its range is [0,1].
    The goal, then, is to find the smoothest function that satisfies the constraints:
    1. Non-decreasing.
    2. Tends to 0 as x approaches negative infinity and tends to 1 as x approaches infinity.
    3. Passes through a given list of y-error-bars.
    The first example I plotted above seems to be a good candidate, but I did that with Mathematica's FindFit function assuming a lognormal CDF. That works well in this specific example, but in general there need not be a lognormal CDF that satisfies the constraints.
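
    For reference, a minimal sketch of the lognormal-CDF fit mentioned above (assuming Mathematica; mu and sigma are the fit parameters with illustrative starting values, and mids is the midpoint list defined earlier):

        (* fit a lognormal CDF through the constraint midpoints *)
        model = CDF[LogNormalDistribution[mu, sigma], x];
        fit = FindFit[mids, model, {{mu, 1}, {sigma, 1}}, x];
        Plot[model /. fit, {x, 0, 6}]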

    Read the article

  • Performance problem: data warehouse with lots of indexes

    - by Lieven Cardoen
    Our product takes tests of some 350 candidates at the same time. At the end of the test, results for each candidate are moved to a data warehouse that has lots of indexes on it. For each test there are some 400 records to be entered in the data warehouse, so 400 x 350 is a lot of records. If there are not many records in the data warehouse, all goes well. But if there are already lots of records in it, then a lot of inserts fail... Is there a way to have indexes that are only rebuilt at the end of the day, or isn't that the real problem? How would you solve this?
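
    If the target is SQL Server, one common pattern for deferring index maintenance (a hedged sketch with hypothetical table and index names) is to disable the non-clustered indexes around the bulk load and rebuild them afterwards, e.g. from a nightly job:

        -- disable: the index is not maintained (and cannot be used) until rebuilt
        ALTER INDEX IX_Results_Candidate ON dbo.Results DISABLE;
        -- ... perform the 400 x 350 inserts ...
        ALTER INDEX IX_Results_Candidate ON dbo.Results REBUILD;

    Whether that is the right fix depends on why the inserts actually fail (locking, timeouts, log growth), which is worth checking first.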

    Read the article

  • Appending html data when item in html select list is selected

    - by Workoholic
    I have a selector that could look like this:
        <label for="testselector">Test</label><br />
        <select id="testselector" name="test[]" size="5" multiple="multiple">
          <option name="test_1" value="1">Test Entry X</option>
          <option name="test_3" value="2">Test Entry Y</option>
          <option name="test_5" value="5">Test Entry Z</option>
        </select>
        <div id="fieldcontainer"></div>
    When an entry from the above fields is selected, I want two form fields to appear. I use jQuery to add them:
        $("#fieldcontainer").append("<div><label for=\"testurl\">Test Url</label><br /><input name=\"testurl[]\" type=\"text\" id=\"testurl_1\" value=\"\" /></div>");
        $("#fieldcontainer").append("<div><label for=\"testpath\">Test Path</label><br /><input name=\"testpath[]\" type=\"text\" id=\"testpath_1\" value=\"\" /></div>");
    I can easily add a click handler to make those form fields appear. But how would I keep track of what form fields were already added? When multiple fields are selected, the appropriate number of input fields needs to be added to the fieldcontainer div. And if they are unselected, the input fields need to be removed again. I also need to retrieve the value from the options in the select list to add them as identifier in the added input fields...
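
    One way to handle the bookkeeping (a rough sketch, assuming jQuery; it simply rebuilds the container on every change, which avoids tracking which fields already exist, and uses each option's value as the identifier):

        $("#testselector").change(function () {
            var container = $("#fieldcontainer").empty();   // throw away old fields
            $(this).find("option:selected").each(function () {
                var id = $(this).val();                      // option value as identifier
                container.append(
                    "<div><label for=\"testurl_" + id + "\">Test Url</label><br />" +
                    "<input name=\"testurl[]\" type=\"text\" id=\"testurl_" + id + "\" value=\"\" /></div>" +
                    "<div><label for=\"testpath_" + id + "\">Test Path</label><br />" +
                    "<input name=\"testpath[]\" type=\"text\" id=\"testpath_" + id + "\" value=\"\" /></div>");
            });
        });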

    Read the article

  • What is better, a STL list or a STL Map for 20 entries, considering order of insertion is as importa

    - by Abhijeet
    I have the following scenario. The implementation is required for a real-time application.
    1) I need to store at most 20 entries in a container (STL map, STL list, etc.).
    2) If a new entry comes and 20 entries are already present, I have to overwrite the oldest entry with the new entry.
    Considering point 2, I feel that if the container is full (max 20 entries) a list is the best bet, as I can always remove the first entry in the list and add the new one at the end (push_back). However, search won't be as efficient. For only 20 entries, does it really make a big difference in terms of search efficiency if I use a list in place of a map? Also, considering the cost of insertion in a map, I feel I should go for a list. Could you please tell me which is the better bet?
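
    For what it's worth, a minimal sketch of the list-style approach (C++98, with a hypothetical Entry type keyed by an int): insertion order is preserved, the oldest element is evicted once 20 are stored, and lookups are a linear scan.

        #include <cstddef>
        #include <list>
        #include <string>

        struct Entry { int key; std::string value; };

        class RecentEntries {
            std::list<Entry> entries_;
            static const std::size_t kMax = 20;
        public:
            void add(const Entry& e) {
                if (entries_.size() == kMax) entries_.pop_front();  // overwrite the oldest
                entries_.push_back(e);
            }
            const Entry* find(int key) const {
                // linear scan: for 20 elements this is typically as fast as a map lookup
                for (std::list<Entry>::const_iterator it = entries_.begin(); it != entries_.end(); ++it)
                    if (it->key == key) return &*it;
                return 0;
            }
        };

    At this size the asymptotic advantage of a map rarely matters; the simpler container usually wins on both clarity and constant factors.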

    Read the article

  • Retrieving related data in the Symfony Admin Generator

    - by bjoern
    I have a problem with the Admin Generator. The table of pages has the column sf_guard_user_id; the rest of the table is set up as in this part of generator.yml (note the display line):
        list:
          title:        Pages
          display:      [=title, sfGuardUser, views, state, privacy, created_at, updated_at]
          sort:         [created_at, desc]
          fields:
            sfGuardUser:  { label: Author }
            created_at:   { label: Published, date_format: dd.MM.y }
            updated_at:   { label: Updated, date_format: dd.MM.y }
          table_method:  retrieveUserList
    Now the sf_guard_user_id is replaced and the username is displayed. Don't get me wrong, that works fine. But how can I get other fields from the sfGuardUser relation? When I add salt or another column to display, I get this:
        Unknown record property / related component "salt" on "simplePage"
    But why?
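
    One approach that usually works (a hedged sketch, assuming symfony 1.x with Doctrine; the class, method, and field names are illustrative): expose the related value through a getter on the page model and list that virtual field in display. Putting salt itself in display fails because salt is a property of sfGuardUser, not of the page record.

        // lib/model/doctrine/Page.class.php (hypothetical location)
        class Page extends BasePage
        {
            public function getAuthorSalt()
            {
                // reach through the relation to the related sfGuardUser column
                return $this->getSfGuardUser()->getSalt();
            }
        }

    In generator.yml the field would then appear as author_salt in the display list, with its own label under fields.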

    Read the article

  • C# datatable to listview

    - by Data-Base
    I'd like to be able to view a DataTable in a Windows Forms ListView. So far I have managed to get only the headers:
        DataTable data = new DataTable();
        data = EnumServices();
        // create headers
        foreach (DataColumn column in data.Columns)
        {
            listView_Services.Columns.Add(column.ColumnName);
        }
    I just want to show the data in there now! Cheers
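
    Filling in the rows could then look like this (a rough sketch; it assumes the ListView is switched to Details view so the sub-items line up under the column headers):

        listView_Services.View = View.Details;
        foreach (DataRow row in data.Rows)
        {
            // first column becomes the item text, the rest become sub-items
            ListViewItem item = new ListViewItem(row[0].ToString());
            for (int i = 1; i < data.Columns.Count; i++)
            {
                item.SubItems.Add(row[i].ToString());
            }
            listView_Services.Items.Add(item);
        }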

    Read the article

  • how to use an mdf in App_Data with shared hosting

    - by name
    If I create a website that uses an mdf in App_Data with the connection string
        Server=.\SQLExpress;AttachDbFilename=|DataDirectory|mydbfile.mdf;Database=dbname;Trusted_Connection=Yes;
    what do I need to do to run the site in a shared hosting environment? Do I need to copy the contents of my mdf to the main SQL Server engine of my host? Is there a way to use the non-SQLExpress engine of my host and still keep my mdf in my App_Data?
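
    Most shared hosts expect the database to live on their full SQL Server instance rather than being attached from App_Data, so the usual route is to import the mdf (or script its schema and data) into the hosted database and point the connection string at it. A sketch, where the server, database, and credentials are placeholders:

        Server=sql.example-host.com;Database=dbname;User Id=dbuser;Password=dbpassword;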

    Read the article

  • Magento resource model for table with compound primary key

    - by sdek
    I am creating a custom module for a Magento ecommerce site, and the module will center around a new (i.e., custom) table that has a compound/composite primary key; or rather, the table has two columns that make up the primary key. Does anybody know how to create your models/resource models based on a table with a compound key?
    To give a few more details, I have looked up several tutorials and also used the excellent moduleCreator script. But it seems like all the tutorials revolve around the table having a PK with just one column in it. Something like this:
        class <Namespace>_<Module>_Model_Mysql4_<Module> extends Mage_Core_Model_Mysql4_Abstract
        {
            public function _construct()
            {
                $this->_init('<module_alias>/<table_alias>', '<table_primary_key_id>');
            }
        }
    Also, I just noticed that, looking at the database model, almost all tables have a single primary key. I understand this has much to do with the EAV-style db structure, but is it still possible to use a table with a compound PK? I want to stick with the Magento framework/conventions if possible. Is it discouraged? Should I just change the structure of my custom table to have some dummy id column? I do have the ability to do that, but geez!
    (Another side note that I thought I would mention: the Zend Framework provides a way to base a class on a table with a compound primary key (see Example #20 on this page, about half-way down), so it seems that the Magento framework should also provide for it... I just don't see how.)
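
    If you do end up taking the dummy-id route asked about above, a hedged sketch of the table change (MySQL; entity_id and the column names are illustrative) keeps the old compound key as a unique index, and the resource model's _init() then points at entity_id in the usual single-key way:

        ALTER TABLE my_custom_table
            DROP PRIMARY KEY,
            ADD COLUMN entity_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
            ADD UNIQUE KEY uniq_original_pk (col_a, col_b);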

    Read the article

  • sql-server: how to select duplicate rows from a table?

    - by RedsDevils
    Hi all, I have the following table:
        CREATE TABLE TEST(ID TINYINT NULL, COL1 CHAR(1))
        INSERT INTO TEST(ID,COL1) VALUES (1,'A')
        INSERT INTO TEST(ID,COL1) VALUES (2,'B')
        INSERT INTO TEST(ID,COL1) VALUES (1,'A')
        INSERT INTO TEST(ID,COL1) VALUES (1,'B')
        INSERT INTO TEST(ID,COL1) VALUES (1,'B')
        INSERT INTO TEST(ID,COL1) VALUES (2,'B')
    I would like to select the duplicate rows from that table. How can I select them? I tried the following:
        SELECT TEST.ID, TEST.COL1
        FROM TEST
        WHERE TEST.ID IN
            (SELECT ID FROM TEST
             WHERE TEST.COL1 IN
                 (SELECT COL1 FROM TEST
                  WHERE TEST.ID IN
                      (SELECT ID FROM TEST GROUP BY ID HAVING COUNT(*) > 1)
                  GROUP BY COL1 HAVING COUNT(*) > 1)
             GROUP BY ID HAVING COUNT(*) > 1)
    Where's the error? Can you modify it? Help me! Thanks in advance!
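
    A hedged sketch of one way to get the duplicated rows: group on both columns and join back, so every row whose (ID, COL1) pair occurs more than once is returned.

        SELECT T.ID, T.COL1
        FROM TEST AS T
        INNER JOIN (
            SELECT ID, COL1
            FROM TEST
            GROUP BY ID, COL1
            HAVING COUNT(*) > 1      -- pairs that appear at least twice
        ) AS D ON D.ID = T.ID AND D.COL1 = T.COL1;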

    Read the article

  • Pipe data from InputStream to OutputStream in Java

    - by Wangnick
    Dear all, I'd like to send a file contained in a ZIP archive, unzipped, to an external program for further decoding, and to read the result back into Java.
        ZipInputStream zis = new ZipInputStream(new FileInputStream(ZIPPATH));
        Process decoder = new ProcessBuilder(DECODER).start();
        ???
        BufferedReader br = new BufferedReader(new InputStreamReader(
                decoder.getInputStream(), "us-ascii"));
        for (String line = br.readLine(); line != null; line = br.readLine()) {
            ...
        }
    What do I need to put into ??? to pipe the zis content to decoder.getOutputStream()? I guess a dedicated thread is needed, as the decoder process might block when its output is not consumed.
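
    A minimal sketch of the ??? part (it assumes the first ZIP entry is the file to decode, the usual java.io imports, and that the surrounding method throws IOException): a separate thread pumps the unzipped bytes into the decoder's stdin, so reading its stdout in the main thread cannot deadlock.

        zis.getNextEntry();                         // position at the entry to send
        final InputStream unzipped = zis;           // effectively-final copies for the inner class
        final OutputStream toDecoder = decoder.getOutputStream();
        new Thread(new Runnable() {
            public void run() {
                try {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = unzipped.read(buf)) != -1) {
                        toDecoder.write(buf, 0, n);
                    }
                    toDecoder.close();              // EOF tells the decoder no more input is coming
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }).start();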

    Read the article

  • AS3: How to access pixel data efficiently?

    - by JonoRR
    I'm working on a game. The game requires entities to analyse an image and head towards pixels with specific properties (high red channel, etc.). I've looked into Pixel Bender, but this only seems useful for writing new colors to the image. At the moment, even at a low resolution (200x200), just one entity scanning the image slows to 1-2 frames/second. I'm embedding the image and instancing it as a Bitmap as a child of the stage. The 1-2 FPS situation is using BitmapData.getPixel() (on each pixel) with a distance calculation beforehand. I'm wondering if there's any way I can do this more efficiently... My first thought was some sort of spatial partitioning coupled with splitting the image up into many smaller pieces. I also feel like Pixel Bender should be able to help somehow; however, I've had little experience with it. Cheers for any help. Jonathan
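
    One thing that tends to help a lot (a hedged sketch, assuming Flash Player 10+ and a Bitmap instance named bitmap): pull the pixels out once with BitmapData.getVector() and scan the resulting Vector.<uint> instead of calling getPixel() per pixel.

        var bmd:BitmapData = bitmap.bitmapData;
        var pixels:Vector.<uint> = bmd.getVector(bmd.rect);   // one call instead of width*height getPixel()s
        var bestRed:int = -1, bestX:int = 0, bestY:int = 0;
        for (var i:int = 0; i < pixels.length; i++) {
            var red:int = (pixels[i] >> 16) & 0xFF;            // ARGB layout
            if (red > bestRed) {
                bestRed = red;
                bestX = i % bmd.width;
                bestY = i / bmd.width;                         // int-typed var truncates
            }
        }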

    Read the article

  • Case insensitive duplicates SQL

    - by hdx
    So I have a users table where user.username has many duplicates like:
        username and Username and useRnAme
        john and John and jOhn
    That was a bug and these three records should have been only one. I'm trying to come up with a SQL query that lists all of these cases ordered by their creation date, so ideally the result should be something like this:
        username   jan01
        useRnAme   jan02
        Username   jan03
        john       feb01
        John       feb02
        jOhn       feb03
    Any suggestions will be much appreciated.
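
    A hedged sketch (the DBMS isn't stated, so this uses plain LOWER(); created_at stands in for whatever the creation-date column is called): list every username whose lower-cased form occurs more than once, ordered so the variants group together by creation date.

        SELECT u.username, u.created_at
        FROM users u
        INNER JOIN (
            SELECT LOWER(username) AS uname
            FROM users
            GROUP BY LOWER(username)
            HAVING COUNT(*) > 1          -- case-insensitive duplicates only
        ) d ON LOWER(u.username) = d.uname
        ORDER BY LOWER(u.username), u.created_at;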

    Read the article

  • Interpolation of time series data in R

    - by Pierreten
    I'm not sure what I'm missing here, but I'm basically trying to compute interpolated values for a time series. When I directly plot the series, constraining the interpolation points with interpolation.date.vector, the plot is correct:
        plot(date.vector, fact.vector, ylab='Quantity')
        lines(spline(date.vector, fact.vector, xout=interpolation.date.vector))
    When I compute the interpolation, store it in an intermediate variable, and then plot the results, I get a radically incorrect result:
        intepolated.values <- spline(date.vector, fact.vector, xout=interpolation.date.vector)
        plot(intepolated.values$x, intepolated.values$y)
        lines(testinterp$x, testinterp$y)
    Doesn't the lines() function have to execute the spline() function to retrieve the interpolated points in the same way I'm doing it?
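
    One thing worth ruling out first (a minimal sketch using the same variables as above): the second snippet draws its line from testinterp rather than from the freshly computed intepolated.values, so plotting both the points and the line from the one stored object makes the comparison fair.

        interp <- spline(date.vector, fact.vector, xout = interpolation.date.vector)
        plot(date.vector, fact.vector, ylab = "Quantity")  # original series
        lines(interp$x, interp$y)                          # interpolated values from the stored result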

    Read the article

  • javascript return function's data as a file

    - by Dennis
    I have a function in JavaScript called "dumpData" which I call from a button on an HTML page as onclick="dumpData(dbControl);". What it does is return an XML file of the settings (to an alert box right now). I want to return it to the user as a file download. Is there a way to create a button that, when clicked, will open a file download box and ask the user to save or open it? (Sort of like right-clicking and "save target as"...) Or can it be sent to a PHP file and use export();? I'm not sure how I would send a long string like that to PHP and have it simply send it back as a file download. Dennis
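
    A hedged sketch of the PHP half (download.php is a hypothetical name, and it assumes the JavaScript posts the XML string in a form field named xml, e.g. via a hidden form): echoing the string back with a Content-Disposition header makes the browser show the save/open dialog.

        <?php
        // download.php (hypothetical): turn the posted XML string into a file download
        header('Content-Type: application/xml');
        header('Content-Disposition: attachment; filename="settings.xml"');
        echo $_POST['xml'];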

    Read the article

  • Data in Linux FIFO seems lost

    - by Utoah
    Hi, I have a bash script which wants to do some work in parallel; I did this by putting each job in a subshell which is run in the background. The number of jobs running simultaneously should stay under some limit, which I achieve by first putting some lines in a FIFO; then, just before forking the subshell, the parent script is required to read a line from this FIFO. Only after it gets a line can it fork the subshell. Up to now, everything works fine. But when I tried to read a line from the FIFO in the subshell, it seems that only one subshell can get a line, even if there are apparently more lines in the FIFO. So I wonder why the other subshell(s) cannot read a line even when there are more lines in the FIFO. My testing code looks something like this:
        #!/bin/sh
        fifo_path="/tmp/fy_u_test2.fifo"
        mkfifo $fifo_path
        #open fifo for r/w at fd 6
        exec 6<> $fifo_path
        process_num=5
        #put $process_num lines in the FIFO
        for ((i=0; i<${process_num}; i++)); do
            echo "$i"
        done >&6

        delay_some(){
            local index="$1"
            echo "This is what u can see. $index \n"
            sleep 20;
        }

        #In each iteration, try to read 2 lines from FIFO, one from this shell,
        #the other from the subshell
        for i in 1 2
        do
            date >> /tmp/fy_date
            #If a line can be read from FIFO, run a subshell in bk, otherwise, block.
            read -u6
            echo " $$ Read --- $REPLY --- from 6 \n" >> /tmp/fy_date
            {
                delay_some $i
                #Try to read a line from FIFO
                read -u6
                echo " $$ This is in child # $i, read --- $REPLY --- from 6 \n" >> /tmp/fy_date
            } &
        done
    And the output file /tmp/fy_date has content of:
        Mon Apr 26 16:02:18 CST 2010
         32561 Read --- 0 --- from 6 \n
        Mon Apr 26 16:02:18 CST 2010
         32561 Read --- 1 --- from 6 \n
         32561 This is in child # 1, read --- 2 --- from 6 \n

    Read the article

  • Updates to NSDictionary attribute in CoreData not saving

    - by sfkaos
    I have created an Entity in CoreData that includes a Transformable attribute type implemented as an NSDictionary. The NSDictionary attribute only contains values of a custom class whose properties are all of type NSString. The custom class complies with NSCoding, implementing:
        - (void)encodeWithCoder:(NSCoder *)coder;
        - (id)initWithCoder:(NSCoder *)coder;
    When saving the Entity for the first time, all attributes, including the Transformable (NSDictionary) type, are properly saved in the DB. When the same Entity is fetched from the DB and updated (including the Transformable attribute), it seems to be updated properly. However, when the app is closed and then reopened, fetching the Entity does not show the updated Transformable attribute, though the rest of the attributes (of type NSDate and NSString) are up to date. The Transformable attribute is the original saved value, not the updated value. Is this a problem with KVO, or am I missing something else when trying to save an NSDictionary filled with a custom class to CoreData?
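
    If the dictionary is being changed in place, that is worth checking first: Core Data generally only notices a transformable attribute when the property itself is set, not when the stored object is mutated. A hedged sketch (entity, settings, and the key are hypothetical names; assumes manual reference counting):

        // replace the whole dictionary so the managed object registers the change
        NSMutableDictionary *updated = [[entity.settings mutableCopy] autorelease];
        [updated setObject:newValue forKey:@"someKey"];
        entity.settings = updated;                       // assigning the property marks the object dirty

        NSError *error = nil;
        if (![managedObjectContext save:&error]) {
            NSLog(@"Save failed: %@", error);
        }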

    Read the article
