Search Results

Search found 61631 results on 2466 pages for 'duplicate data'.

Page 415/2466 | < Previous Page | 411 412 413 414 415 416 417 418 419 420 421 422  | Next Page >

  • Count of products NOT sold...per store, per day over the past month

    - by user1893510
    I'm struggling with an interview question. 3 dimension tables (Product, Store and Date) and 1 fact table (Sales). The question asks for a T-SQL solution that will return the count of products not sold, per store, per day over the past month. At this point, my answer is futile but I've spent significant time trying to back into a solution, to no avail, and would like to close the loop. Any guidance is greatly appreciated.
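
    A hedged sketch of one common shape for this query, assuming surrogate keys named ProductKey/StoreKey/DateKey and a date attribute FullDate on the Date dimension (all of these names are guesses, not from the question): cross join the three dimensions to get every product/store/day combination for the past month, left join the fact table, and count the combinations with no matching sale.

        SELECT   s.StoreKey,
                 d.DateKey,
                 -- unmatched rows are the products that did not sell in that store on that day
                 SUM(CASE WHEN f.ProductKey IS NULL THEN 1 ELSE 0 END) AS ProductsNotSold
        FROM     Store s
        CROSS JOIN [Date] d
        CROSS JOIN Product p
        LEFT JOIN Sales f
               ON f.ProductKey = p.ProductKey
              AND f.StoreKey   = s.StoreKey
              AND f.DateKey    = d.DateKey
        WHERE    d.FullDate >= DATEADD(MONTH, -1, CAST(GETDATE() AS date))
          AND    d.FullDate <  CAST(GETDATE() AS date)
        GROUP BY s.StoreKey, d.DateKey
        ORDER BY s.StoreKey, d.DateKey;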

    Read the article

  • Creating tables and inserting data (MySQL Dump) using PHP Doctrine 1.2

    - by Dimitry
    Hello. I have a script that creates a new database, and now I need to fill that database with tables and values (from a MySQL dump file). I'm using PHP with Doctrine 1.2. Here is how I create the database:

        $manager = Doctrine_Manager::getInstance();
        $newConn = $manager->openConnection($customer->Config->db_connection_string);
        $newConn->createDatabase();

    How do I do this? Thanks!
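
    One possible sketch, not a definitive Doctrine recipe: after createDatabase(), grab the underlying PDO handle (getDbh() is assumed to be available on the Doctrine 1.x connection) and replay the dump statement by statement. The naive split below will break on data that itself contains ";\n".

        $sql = file_get_contents('/path/to/dump.sql');   // path is illustrative
        $pdo = $newConn->getDbh();                       // underlying PDO connection
        foreach (explode(";\n", $sql) as $statement) {
            $statement = trim($statement);
            if ($statement !== '') {
                $pdo->exec($statement);                  // run each CREATE TABLE / INSERT in turn
            }
        }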

    Read the article

  • Why do dynamic arrays always grow by a factor of 2?

    - by Phoenix
    I was wondering how one decides the resizing factor by which a dynamic array grows. On Wikipedia and elsewhere I have always seen the number of elements being increased by a factor of 2. Why 2? Why not 3? How does one decide this factor? If it is language dependent, I would like to know this for Java.
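
    For what it's worth, a minimal sketch of the usual reasoning, in Java since the question asks about it (the class is illustrative, not the JDK's): any growth factor greater than 1 keeps add() amortized O(1), because n appends trigger only O(log n) copies whose total work is proportional to n. Doubling is simply a convenient trade-off between wasted space and copy frequency; java.util.ArrayList in fact grows by roughly 1.5x (newCapacity = oldCapacity + (oldCapacity >> 1)).

        import java.util.Arrays;

        class GrowableIntArray {
            private int[] data = new int[8];
            private int size = 0;

            void add(int value) {
                if (size == data.length) {
                    // double on overflow; a factor of 3 would also work, just with
                    // more slack space between copies
                    data = Arrays.copyOf(data, data.length * 2);
                }
                data[size++] = value;
            }

            int get(int index) {
                return data[index];
            }
        }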

    Read the article

  • CoreData is saving model without the save: method being called

    - by chris
    I have a list of items with a plus button in the navigation bar that opens up a modal window containing a table of the model's attributes, which displays a form when the table items are clicked (pretty standard form style). For some reason, if I click the plus button to open the form to create a new model, then immediately click the done button, the person model is saved. The action linked to the done button does nothing but call a delegate method notifying the personListViewController to close the window. The Apple docs do state that the model is not saved after calling insertNewObjectForEntityForName: ...

        Simply creating a managed object does not cause it to be saved to a persistent store. The managed object context acts as a scratchpad.

    I am at a loss as to why this is happening, but every time I click the done button I have a new blank item in the original tableView. I am using SDK v3.1.3.

        // PersonListViewController - open modal window
        - (void)addPerson {
            // Load the new form
            PersonNewViewController *newController = [[PersonNewViewController alloc] init];
            newController.modalFormDelegate = self;
            [self presentModalViewController:newController animated:YES];
            [newController release];
        }

        // PersonFormViewController
        - (void)viewDidLoad {
            [super viewDidLoad];
            if ( person == nil ) {
                MyAppDelegate *appDelegate = [[UIApplication sharedApplication] delegate];
                self.person = [NSEntityDescription insertNewObjectForEntityForName:@"Person"
                                                    inManagedObjectContext:appDelegate.managedObjectContext];
            }
            ...
            UIBarButtonItem *doneButton = [[UIBarButtonItem alloc]
                initWithBarButtonSystemItem:UIBarButtonSystemItemDone
                                     target:self
                                     action:@selector(done)];
            self.navigationItem.rightBarButtonItem = doneButton;
            [doneButton release];
        }

        - (IBAction)done {
            [self.modalFormDelegate didCancel];
        }

        // delegate method in the original listViewController
        - (void)didCancel {
            [self dismissModalViewControllerAnimated:YES];
        }
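
    A hedged guess at the cause, with a sketch: insertNewObjectForEntityForName: adds the blank Person to the managed object context immediately, so if anything else saves that context later (or a fetched results controller is watching it), the empty row shows up. Discarding the unsaved insert when the form is closed avoids that; the names follow the question's code.

        // delegate method in the original listViewController
        - (void)didCancel {
            MyAppDelegate *appDelegate = (MyAppDelegate *)[[UIApplication sharedApplication] delegate];
            [appDelegate.managedObjectContext rollback];   // drops unsaved inserts, including the blank Person
            [self dismissModalViewControllerAnimated:YES];
        }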

    Read the article

  • Convert date error when retrieving data in a Twig page through Doctrine

    - by user201892
    I get this error when I retrieve all my records on a Twig page from the database through Doctrine. Here is my Twig code:

        {% extends "gestionConferenceApplicationBundle::layout.html.twig" %}
        {% block title "Hello " ~ name %}
        {% block content %}
        je suis un debutant
        <table border=2 >
            <th>Numéro</th>
            <th>Titre</th>
            <th>Ville</th>
            <th>Lieu</th>
            <th>Date de début</th>
            <th>Date de fin</th>
            {% for item in conferences %}
            <tr>
                <td>{{ item.id }}</td>
                <td>{{ item.titre }}</td>
                <td>{{ item.ville }}</td>
                <td>{{ item.lieu }}</td>
                <td>{{ item.dateDebut }}</td>
                <td>{{ item.dateFin }}</td>
            </tr>
            {% endfor %}
        </table>
        {% endblock %}

    The error is on the date. Here it is:

        An exception has been thrown during the rendering of a template ("Catchable Fatal Error: Object of class DateTime could not be converted to string in C:\wamp\www\Symfony\app\cache\dev\twig\3c\dd\40a703549de9b8769fa40b82230e.php line 72") in gestionConferenceApplicationBundle:acceuil:acceuil.html.twig at line 19.

    Do you have any idea how to convert this date? In MySQL I have a datetime field, and in Doctrine I have:

        /**
         * @var \DateTime $dateDebut
         *
         * @ORM\Column(name="date_debut", type="datetime", nullable=false)
         */
        private $dateDebut;
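
    A minimal sketch of the usual fix: Doctrine hydrates datetime columns as DateTime objects, so format them with Twig's date filter in the template (the format string here is only an example):

        <td>{{ item.dateDebut|date('Y-m-d H:i') }}</td>
        <td>{{ item.dateFin|date('Y-m-d H:i') }}</td>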

    Read the article

  • ASP.NET - Update progress while processing data

    - by Danny
    Hope I can explain this correctly. I have a process that outputs step-by-step messages (i.e., "Processing item 1...", "Error in item 2", etc.). I want this to be shown to the user during the process, not at the end. I'm pretty sure I need to do this with threading, but I can't find a decent example. Thanks in advance.
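
    One rough sketch, assuming classic WebForms without AJAX (ProcessItem and itemCount are placeholders, not from the question): turn off response buffering and flush each status line as the step finishes. A background thread whose status is polled by an UpdatePanel/Timer is the other common route.

        protected void ProcessButton_Click(object sender, EventArgs e)
        {
            Response.BufferOutput = false;                  // let partial output reach the browser
            for (int i = 1; i <= itemCount; i++)
            {
                string message = ProcessItem(i);            // placeholder for the real per-item work
                Response.Write(message + "<br/>");
                Response.Flush();                           // push this line to the client immediately
            }
        }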

    Read the article

  • Best datastructure for frequently queried list of objects

    - by panzerschreck
    Hello, I have a list of objects, say List<Entity>. The Entity class has an equals method on a few attributes (business rule) to differentiate one Entity object from another. The task that we usually carry out on this list is to remove all the duplicates, something like this:

        List<Entity> noDuplicates = new ArrayList<Entity>();
        for (Entity entity : lstEntities) {
            int indexOf = noDuplicates.indexOf(entity);
            if (indexOf >= 0) {
                noDuplicates.get(indexOf).merge(entity);
            } else {
                noDuplicates.add(entity);
            }
        }

    Now, the problem I have been observing is that this part of the code slows down considerably as soon as the list has more than 10,000 objects. I understand ArrayList is doing an O(n) search. Is there a faster alternative? Using HashMap is not an option, because the entity's uniqueness is built upon four of its attributes together; it would be tedious to put the key itself into the map. Will a sorted set help in faster querying? Thanks
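
    A sketch of the map-based alternative, assuming Entity keeps its existing equals() and gains a hashCode() built from the same four attributes (e.g. Objects.hash(a, b, c, d)): the entity then works as its own key, so the O(n) indexOf scan becomes an O(1) lookup and the whole pass is O(n) instead of O(n^2).

        import java.util.ArrayList;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        class EntityDeduper {
            static List<Entity> removeDuplicates(List<Entity> lstEntities) {
                Map<Entity, Entity> unique = new LinkedHashMap<Entity, Entity>();  // keeps first-seen order
                for (Entity entity : lstEntities) {
                    Entity existing = unique.get(entity);   // O(1) once hashCode() matches equals()
                    if (existing != null) {
                        existing.merge(entity);             // same merge rule as the original loop
                    } else {
                        unique.put(entity, entity);
                    }
                }
                return new ArrayList<Entity>(unique.keySet());
            }
        }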

    Read the article

  • IndexOutOfRangeException when a stream is a multiple of the buffer size

    - by dnord
    I don't have a lot of experience with streams and buffers, but I'm having to use them for a project, and I'm stuck on an exception being thrown when the stream I'm reading is a multiple of the buffer size I've chosen. Let me show you: my code starts by reading bufferSize (100, let's say) bytes from the stream:

        numberOfBytesRead = DataReader.GetBytes(0, index, output, 0, bufferSize);

    Then, I loop through a while loop:

        while (numberOfBytesRead == bufferSize)
        {
            BufferWriter.Write(output);
            BufferWriter.Flush();
            index += bufferSize;
            numberOfBytesRead = DataReader.GetBytes(0, index, output, 0, bufferSize);
        }

    ...and, once we get a read smaller than bufferSize, we know we've hit the end of the stream and can move on. But if the bufferSize is 100 and the stream is 200, we'll read positions 0-99, 100-199, and then the attempt to read 200-299 errors out. I'd like it to return 0, but it throws an error. What I'm doing to handle that is, well, a try-catch:

        catch (System.IndexOutOfRangeException)
        {
            numberOfBytesRead = 0;
        }

    ...which ends the loop and successfully finishes the thing, but we all know I don't want to control code flow with error handling. Is there a better (more standard?) way to handle stream reading when the stream length is unknown? This seems like a small wrinkle in a fairly reasonable strategy for reading streams, but I just don't know if I've got it wrong or what. The specifics of this (which I've cleaned up a little bit for posting) are a MySqlDataReader hitting a LARGEBLOB column. It works whenever the buffer is larger than the number of returned bytes, or when the number of returned bytes is not a multiple of bufferSize, because in those cases we never throw an IndexOutOfRangeException.
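
    A hedged alternative sketch: the standard ADO.NET providers report the total field length when GetBytes is handed a null buffer, so the loop can be driven by the byte count instead of waiting for a short read, and never asks for bytes past the end. Variable names follow the question.

        long totalLength = DataReader.GetBytes(0, 0, null, 0, 0);   // null buffer => field length
        byte[] output = new byte[bufferSize];
        long index = 0;

        while (index < totalLength)
        {
            int chunk = (int)Math.Min(bufferSize, totalLength - index);   // never read past the end
            long numberOfBytesRead = DataReader.GetBytes(0, index, output, 0, chunk);
            BufferWriter.Write(output, 0, (int)numberOfBytesRead);
            BufferWriter.Flush();
            index += numberOfBytesRead;
        }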

    Read the article

  • How to optimize this Python code?

    - by RandomVector
        def maxVote(nLabels):
            count = {}
            maxList = []
            maxCount = 0
            for nLabel in nLabels:
                if nLabel in count:
                    count[nLabel] += 1
                else:
                    count[nLabel] = 1
                # Check if the count is max
                if count[nLabel] > maxCount:
                    maxCount = count[nLabel]
                    maxList = [nLabel,]
                elif count[nLabel] == maxCount:
                    maxList.append(nLabel)
            return random.choice(maxList)

    nLabels contains a list of integers. The above function returns the integer with highest frequency, if more than one have same frequency then a randomly selected integer from them is returned. E.g. maxVote([1,3,4,5,5,5,3,12,11]) is 5
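
    A sketch of the same behaviour leaning on the standard library: collections.Counter does the tallying, and one extra pass collects every label tied for the top count before the random pick.

        import random
        from collections import Counter

        def maxVote(nLabels):
            counts = Counter(nLabels)
            maxCount = max(counts.values())
            ties = [label for label, c in counts.items() if c == maxCount]
            return random.choice(ties)

        # maxVote([1, 3, 4, 5, 5, 5, 3, 12, 11]) -> 5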

    Read the article

  • How to open a new window and write data into it

    - by B. Kumar
    I want to write some selected data into another window. For that I wrote JavaScript code like this:

        function Writer() {
            winobj = window.open('', '', 'toolbar=no,status=no,scrollbars=yes,width=800,height=600');
            var txt = '';
            if (document.selection) {
                txt = document.selection.createRange().text;
                winobj.document.selection.createRange().text = txt;
            } else {
                return;
            }
        }

    This function works well with Internet Explorer but not with Firefox or Chrome. Can anyone please modify this code or suggest other code so that it works with all browsers? Thanks
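
    A sketch of a cross-browser variant: window.getSelection() covers Firefox/Chrome, document.selection stays as the legacy IE fallback, and the text is written into the popup with document.write rather than through a selection range on the (empty) new document.

        function Writer() {
            var txt = '';
            if (window.getSelection) {
                txt = window.getSelection().toString();
            } else if (document.selection) {               // legacy IE
                txt = document.selection.createRange().text;
            }
            if (!txt) {
                return;
            }
            var winobj = window.open('', '', 'toolbar=no,status=no,scrollbars=yes,width=800,height=600');
            winobj.document.write('<html><body><pre>' + txt + '</pre></body></html>');
            winobj.document.close();
        }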

    Read the article

  • minLength data validation is not working with Auth component for CakePHP

    - by grokker
    Let's say I have a user registration and I'm using the Auth component (/user/register is allowed, of course). The problem is that if I set a minLength validation rule in the model, it doesn't work, since the Auth component hashes the password; the hash is always longer than my minLength, so the rule passes even if the password is blank. How do I fix this issue? Thanks in advance!
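
    One hedged sketch of the usual workaround (written in the CakePHP 1.x style; the field names are illustrative, not from the question): post the password in a field Auth does not hash, validate that field, then hash it yourself before saving.

        class User extends AppModel {
            var $validate = array(
                'new_password' => array(
                    'rule'    => array('minLength', 8),
                    'message' => 'The password must be at least 8 characters long'
                )
            );

            function beforeSave($options = array()) {
                if (!empty($this->data['User']['new_password'])) {
                    // hash only after validation has seen the plain value
                    $this->data['User']['password'] =
                        Security::hash($this->data['User']['new_password'], null, true);
                }
                return true;
            }
        }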

    Read the article

  • Big-O for GPS data

    - by HH
    A non-critical GPS module uses lists because it needs to be modifiable: new routes added, new distances calculated, continuous comparisons. Well, so I thought, but my team member wrote something I find very hard to get into. His pseudo code:

        int k = 0;
        a[][] <- create mapModuleNearbyDotList -array   //CPU O(n)
        for(j = 1 to n)                                 // O(nlog(m))
            for(i = 1 to n)
                for(k = 1 to n)
                    if(dot is nearby)
                        adj[i][j] = min(adj[i][j], adj[i][k] + adj[k][j]);

    His ideas:
    - transformations of lists to tables
    - his worst case time complexity is O(n^3), where n is the number of elements in his so-called table
    - exception to the last point with a finite structure: O(mlog(n)), where n is the number of vertices and m is an arbitrary constant

    Questions about his ideas:
    - why waste resources transforming constantly-modified lists to a table? Speed? This is the only point where I to some extent agree, but I still cannot understand it
    - the same upper limit n for each for-loop -- perhaps he assumed it to be circular?
    - why does the code take O(mlog(n)) to proceed in time as a finite structure? The term "finite" may be wrong -- explicit?

    Read the article

  • MySQL: create index on 1.4 billion records

    - by SiLent SoNG
    I have a table with 1.4 billion records. The table structure is as follows:

        CREATE TABLE text_page (
            text VARCHAR(255),
            page_id INT UNSIGNED
        ) ENGINE=MYISAM DEFAULT CHARSET=ascii

    The requirement is to create an index over the column text. The table size is about 34G. I have tried to create the index with the following statement:

        ALTER TABLE text_page ADD KEY ix_text (text)

    After 10 hours of waiting I finally gave up on this approach. Is there any workable solution to this problem? UPDATE: the table is unlikely to be updated, inserted into, or deleted from. The reason for creating an index on the column text is that this kind of SQL query would be frequently executed:

        SELECT page_id FROM text_page WHERE text = ?
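
    A couple of hedged sketches, given that the query is an exact match on text: a prefix index keeps the key (and the sort done while building it) far smaller than indexing all 255 bytes, and an indexed hash column is another common route. The prefix length and column names here are illustrative.

        -- prefix index: usually enough selectivity for exact-match lookups
        ALTER TABLE text_page ADD KEY ix_text (text(16));

        -- or: an indexed hash column, queried together with the original text
        ALTER TABLE text_page
            ADD COLUMN text_hash INT UNSIGNED NOT NULL DEFAULT 0,
            ADD KEY ix_text_hash (text_hash);
        UPDATE text_page SET text_hash = CRC32(text);
        -- SELECT page_id FROM text_page WHERE text_hash = CRC32(?) AND text = ?;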

    Read the article

  • two HashMap iteration

    - by user431276
    I have two HashMaps and I can iterate both with the following code:

        Iterator it = mp.entrySet().iterator();
        while (it.hasNext()) {
            Map.Entry pairs = (Map.Entry) it.next();
            String firstVal = pairs.getValue();
        }

        Iterator it2 = mp2.entrySet().iterator();
        while (it2.hasNext()) {
            Map.Entry pairs2 = (Map.Entry) it2.next();
            String secondVal = pairs2.getValue();
        }

        myFunction(firstVal, secondVal)

    Is there any way to iterate two HashMaps at the same time without using two loops? Currently, I have a method that accepts two parameters, and each parameter value is stored in the first and second HashMap. I have to iterate the first map, then the second, to get the values. I think there must be a better way to do it but I don't know :( P.S: there could be some errors in the above code as this is just an example to explain my problem. Each iterator is a method in the original program and accepts one parameter. I couldn't copy-paste the real functions as they are HUGE!
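
    A sketch of a single loop driving both maps through their own iterators. This assumes the two maps have the same size and that pairing entries by iteration order is acceptable (HashMap order is otherwise unspecified); the String value type is illustrative.

        Iterator<Map.Entry<String, String>> it1 = mp.entrySet().iterator();
        Iterator<Map.Entry<String, String>> it2 = mp2.entrySet().iterator();
        while (it1.hasNext() && it2.hasNext()) {
            String firstVal  = it1.next().getValue();
            String secondVal = it2.next().getValue();
            myFunction(firstVal, secondVal);         // called once per paired entry
        }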

    Read the article

  • Persist values/variables from page to page

    - by Todd
    Hello, I'm wondering if there's another solution to my problem that's considered more the SharePoint way. Firstly, my site is an Internet site, not an intranet. The problem is, all I'm trying to do is save values/variables from page to page in SharePoint. I know the issue with session variables, but this seems to be the only way I can see to accomplish this. I know there are web parts that can store this value, but am I wrong in thinking this won't be persisted from page to page? Basically, I'll be extending the Content Query Web Part to dynamically filter its results based off of a variable/value. The user chooses their 'area' from a dropdown, and the CQWPs in the site will change and query results based off of this value (it will be a provincial structure as it is a Canadian site, so if someone chooses the province 'Ontario', this value is saved in a global variable, and these extended CQWPs throughout the site will get this value and query lists flagged as Ontario). Are session variables the only solution? Thanks everyone!

    Read the article

  • MySQL data being inserted twice via PHP

    - by Jascha
    I can't for the life of me figure out why this function is causing multiple entries in my database table... When I run the function I end up with two records stacked on top of each other, one second apart. Here is the function:

        function generate_signup_token() {
            $connection = new DB_Connect();   // <--- my database connection class
            $ip = mysql_real_escape_string($_SERVER['REMOTE_ADDR']);
            $sign_up_token = uniqid(mt_rand(), true);
            $_SESSION['signup_token'] = $sign_up_token;
            $sign_up_token = mysql_real_escape_string($sign_up_token);
            $query = "INSERT INTO `token_manager` (`ip_address`, `signup_token`) VALUES ('$ip', '$sign_up_token')";
            mysql_query($query);
        }

        generate_signup_token();
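
    A hedged sketch of one way to make the call idempotent while the duplicate request is tracked down (two rows one second apart usually mean the page itself ran twice: a second GET from the browser, a stray empty img/link src, or the function being reached from two include paths): skip the insert when the session already holds a token.

        function generate_signup_token() {
            if (!empty($_SESSION['signup_token'])) {
                return $_SESSION['signup_token'];            // already generated for this visitor
            }
            $connection = new DB_Connect();                  // database connection class from the question
            $ip    = mysql_real_escape_string($_SERVER['REMOTE_ADDR']);
            $token = uniqid(mt_rand(), true);
            $_SESSION['signup_token'] = $token;

            $query = "INSERT INTO `token_manager` (`ip_address`, `signup_token`)
                      VALUES ('$ip', '" . mysql_real_escape_string($token) . "')";
            mysql_query($query);
            return $token;
        }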

    Read the article

  • Using awk to return only certain chunks of data

    - by Koriar
    I'm not 100% certain how to phrase my question simply, so I apologize if this has been answered somewhere and I was just unable to find it. What I have are debug logs with authentication packets in them along with a bunch of other output. I need to search through about 2 million lines of logs to find every packet that contains a certain mac address. The packets look something like this (slightly censored):

        -----------------[ header ]-----------------
        Event: Authd-Response (1900)
        Sequence: -54
        Timestamp: 1969-12-31 19:30:00 (0)
        ---------------[ attributes ]---------------
        Auth-Result = Auth-Accept
        Service-Profile-SID = 53
        Service-Profile-SID = 49
        RADIUS-Access-Accept-Attr/WiMAX-Capability = 0x(numbers)
        Session-Timeout = 3600
        Service-Profile-SID = 4
        Service-Profile-SID = 29
        Chargeable-User-Identity = "(Numbers)"
        User-Password = "(the MAC address I'm looking for)"
        --------------------------------------------

    However there are about 10 different possible types with different possible lengths. They all start with the header line and end with the all-dashes line. I've had success using awk to get the code blocks themselves using this:

        awk '/-----------------\[ header \]-----------------/,/--------------------------------------------/' filename.txt

    But I was hoping to be able to use it to return only the packets which contain the MAC address that I need. I've been trying to figure this out for a few days now and I'm pretty stuck. I could try and write a bash script, but I could swear that I've used awk to do something like this before...
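
    A sketch of one way to let awk do the filtering itself (the MAC value is a placeholder): buffer each header-to-dashes block and print it only if the target address appeared somewhere inside it.

        awk -v mac='00:11:22:33:44:55' '
            /\[ header \]/ { buf = ""; found = 0 }                      # a new packet starts here
            {
                buf = buf $0 ORS                                        # collect every line of the block
                if (index($0, mac)) found = 1
            }
            /^-+$/ { if (found) printf "%s", buf; buf = ""; found = 0 } # end of packet: print it if it matched
        ' filename.txt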

    Read the article

  • Python - from file to data structure?

    - by Seafoid
    Hi, I have a large file comprising ~100,000 lines. Each line corresponds to a cluster, and each entry within each line is a reference i.d. for another file (a protein structure in this case), e.g.

        1hgn 1dju 3nmj 8kfn
        9opu 7gfb
        4bui

    I need to read in the file as a list of lists where each line is a sublist, thus preserving the integrity of the cluster, e.g.

        nested_list = [['1hgn', '1dju', '3nmj', '8kfn'], ['9opu', '7gfb'], ['4bui']]

    My current code creates a nested list, but the entries within each list are a single string and not comma separated. Therefore, I cannot slice the list with indices so easily. Any help greatly appreciated. Thanks, S :-)
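
    A sketch, assuming the entries are whitespace-separated (the file name is illustrative): str.split() with no arguments handles any run of spaces or tabs, so each line becomes its own sublist.

        with open('clusters.txt') as handle:
            nested_list = [line.split() for line in handle if line.strip()]

        # e.g. [['1hgn', '1dju', '3nmj', '8kfn'], ['9opu', '7gfb'], ['4bui']]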

    Read the article

  • Check for live Data Source Name Before proceeding

    - by n_kips
    Would it be OK to get a CF app to check for a valid database before proceeding to process a request? This is because there may be instances where the database server is down or being upgraded, and hence an error occurs when a db-dependent request is made. If there is no connection to the db server, the user can be safely redirected to a safe page. Or can cfcatch work? How can this check be done? Thank you.
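
    A hedged sketch of the cftry/cfcatch approach (the datasource name and redirect target are placeholders): probe the datasource with a trivial query and redirect to a static page if it fails.

        <cftry>
            <cfquery name="ping" datasource="#application.dsn#" maxrows="1">
                SELECT 1
            </cfquery>
            <cfcatch type="database">
                <!--- the database server is down or unreachable: send the user somewhere safe --->
                <cflocation url="/maintenance.cfm" addtoken="false">
            </cfcatch>
        </cftry>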

    Read the article
