Search Results

Search found 10691 results on 428 pages for 'batch insert'.

Page 325/428

  • Storing XML in SQL 2005 XMLNS issue

    - by Mike Mengell
    I'm trying to store XML in SQL 2005. I have a very simple table with an Id and an XML column. When the XML contains the attribute xmlns, my searching doesn't work. This is my XML: insert into XMLTest (ItemXML) values ( '<MessageType> <ItemId id="ABC" xmlns="ss" /> <Subject>sub</Subject> </MessageType> ') And this is my query: select itemid, ItemXML.query('(/MessageType/ItemId)') from XMLTest order by ItemId desc If I change the attribute xmlns to anything else, my query works. I don't think I know enough about XML to understand what SQL is doing with the namespace. But it must be processing and storing it differently, maybe? Has anyone had this issue?
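
    Because xmlns="ss" is a namespace declaration, it places the ItemId element into the "ss" namespace, so the unprefixed path /MessageType/ItemId no longer matches it. A minimal sketch of a namespace-aware query, using the table and column names from the question:

        ;WITH XMLNAMESPACES ('ss' AS ns)
        SELECT ItemXML.query('/MessageType/ns:ItemId')   -- ItemId lives in the "ss" namespace
        FROM XMLTest;

        -- or declare the namespace inside the XQuery expression itself
        SELECT ItemXML.query('declare namespace ns="ss"; /MessageType/ns:ItemId')
        FROM XMLTest;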

    Read the article

  • How to rate a connect four game situation in Java

    - by MrPink
    Hey, I am trying to write a simple AI for a "Get four" game. The basic game principles are done, so I can throw in coins of different colors, and they stack on each other and fill a 2D array and so on and so forth. Until now, this is what the method looks like: public int insert(int x, int color) // 0 = empty, 1 = player1, 2 = player2. X is the horizontal coordinate; the y coordinate is determined by how many stones are in the array already, so I think the idea is obvious. Now the problem is that I have to rate specific game situations, i.e. find how many new pairs, triplets and possible 4-in-a-rows I can get in a specific situation, to then give each situation a specific value. With these values I can set up a "game tree" to then decide which move would be best next (later on implementing alpha-beta pruning). My current problem is that I can't think of an efficient way to implement a rating of the current game situation in a Java method. Any ideas would be greatly appreciated! greetings from Germany, Mr. Pink

    Read the article

  • Java List Sorting: Is there a way to keep a list permanently sorted automatically like TreeMap?

    - by david
    In Java you can build up an ArrayList with items and then call: Collections.sort(list, comparator); Is there any way to pass in the Comparator at the time of List creation, like you can do with TreeMap? The goal is to be able to add an element to the list and, instead of having it automatically appended to the end of the list, have the list keep itself sorted based on the Comparator and insert the new element at the index determined by the Comparator. So basically the list might have to re-sort upon every new element added. Is there any way to achieve this with the Comparator or by some other similar means? Thanks.

    Read the article

  • How to retain focus on an editable HTML page after deleting an element

    - by Lukasz
    Hello, I have a website with design mode on (aka. content editable = true) with some basic text on it. To that site I hooked up a shortcut so that at any point in the text I can insert an input box that serves me as an autocomplete. For that input, however, I want it to disappear right after I hit ENTER so that I can continue typing. It is an easy task to just make the input box disappear, but I always lose focus from my document. I would greatly appreciate any suggestions on how to make this work.

    Read the article

  • SQL Server Stored Procedure that returns the number of processed records

    - by Ras
    I have a winform application that fires a Stored Procedure which processes several records (around 500k). In order to inform the user about how many records have been processed, I would need an SP which returns a value every n records - for example, every 1000 rows processed (most are INSERT). Otherwise I would only be able to inform the user when ALL records are processed. Any hints how to solve this? I thought it could be useful to use a trigger or some scheduled task, but I cannot figure out how to implement it.
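
    One commonly used pattern, sketched here with hypothetical table names (dbo.SourceRows and dbo.TargetRows are illustrative, not from the question), is to process in batches and emit a low-severity message with RAISERROR ... WITH NOWAIT after every batch:

        DECLARE @batch INT, @rows INT, @done INT;
        SELECT @batch = 1000, @rows = 1, @done = 0;

        WHILE @rows > 0
        BEGIN
            -- hypothetical work: move one batch of not-yet-processed rows
            INSERT INTO dbo.TargetRows (Id, Payload)
            SELECT TOP (@batch) s.Id, s.Payload
            FROM dbo.SourceRows s
            WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetRows t WHERE t.Id = s.Id);

            SET @rows = @@ROWCOUNT;
            SET @done = @done + @rows;

            -- severity 0 message is sent to the client immediately
            RAISERROR('%d rows processed', 0, 1, @done) WITH NOWAIT;
        END

    On the client side, each message raises the SqlConnection.InfoMessage event, so the winform can update a progress indicator without waiting for the whole procedure to finish.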

    Read the article

  • efficiently trimming PostgreSQL tables

    - by agilefall
    I have about 10 tables with over 2 million records and one with 30 million. I would like to efficiently remove older data from each of these tables. My general algorithm is: (1) create a temp table for each large table and populate it with newer data, (2) truncate the original tables, (3) copy the tmp data back to the original tables using "insert into originaltable (select * from tmp_table)". However, the last step of copying the data back is taking longer than I'd like. I thought about deleting the original tables and making the temp tables "permanent", but I lose constraint/foreign key info. If I delete from the tables directly, it takes much longer. Given that I need to preserve all foreign keys and constraints, are there any faster ways of removing the older data? Thanks.
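
    One alternative worth considering, sketched below under the assumption that the table is called big_table with a created_at timestamp column (both names are illustrative, not from the question): build the trimmed copy once and swap it in by renaming, instead of copying the data back. Note that CREATE TABLE ... (LIKE ...) does not carry over foreign keys, so any foreign key constraints on or referencing the table would still have to be recreated by hand, which may or may not make this acceptable here.

        BEGIN;
        -- copies column definitions, defaults, check constraints and indexes (not foreign keys)
        CREATE TABLE big_table_new (LIKE big_table INCLUDING DEFAULTS INCLUDING CONSTRAINTS INCLUDING INDEXES);

        INSERT INTO big_table_new
        SELECT * FROM big_table
        WHERE created_at >= now() - interval '90 days';

        ALTER TABLE big_table RENAME TO big_table_old;
        ALTER TABLE big_table_new RENAME TO big_table;
        COMMIT;

        DROP TABLE big_table_old;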

    Read the article

  • prevent conversion of <br/>

    - by Chris
    Hello, I fear this is a dumb question, but I can't seem to find the answer. Pretty sure what that makes me... I have C#-generated HTML (HtmlGenerator), into which I sometimes want to insert a line break at a certain part of a cell's inner text. Here is how that comes out: <TD >There are lots of extra &lt; br /&gt; words here </TD> This then displays the <br/> as part of my cell text - not good. Am I missing an easy way to have the <br/> preserved and not converted to &lt; etc.? thanks

    Read the article

  • MS Access vs SQL Server and others? Is it worth taking a DB server when less than 2 GB and only 20 users

    - by asksuperuser
    After my experiment with MS Access vs MySQL, which showed MS Access hugely outperforming the MySQL ODBC insert by a factor of 1000%, and before doing the same experiment with SQL Server, I searched for other people's results and found this one: http://blog.nkadesign.com/2009/access-vs-sql-server-some-stats-part-1/ which says "As a side note, in this particular test, Access offers much better raw performance than SQL Server. In more complex scenarios it’s very likely that Access’ performance would degrade more than SQL Server, but it’s nice to see that Access isn’t a sloth." So is it worth bothering with a DB server when the data is less than 2 GB and there are about 20 users (knowing that MS Access theoretically supports up to 255 concurrent users, though practically it's around a dozen concurrent users only)? Are there any real-world studies that really compare MS Access with other DBs in these specific use cases? Because professionally speaking, I keep hearing people systematically recommend a DB server - people who have never used Access - just because they think a DB server can only perform better in every case, which I confess I used to think myself.

    Read the article

  • Process Allocation in .NET

    - by mayap
    I'm writing a program which should perform calculations concurrently according to inputs which reach the system all the time. I'm considering 2 approaches to allocating "calculation" processes: 1) Allocating processes at system initialization, inserting the ids into a Processes table, and each time I want to perform a calculation, checking in the table which process is free. The question: can I be sure that those processes are only for my use and that the operating system doesn't use them? 2) Not allocating processes in advance. Each time a calculation should be done, ask the operating system for a free process. I need to know the following from a "calculation" process: when the calculation is finished, and also whether it succeeded or failed. If a process has failed, I need to assign the calculation to another process. Thanks in advance. Any help would be appreciated.

    Read the article

  • Drupal hook just before inserting/updating user info

    - by Senthil
    Hi, I want to hook into Drupal's user registration and be able to stop it or allow it to proceed just before the database insert. All validations and 3rd-party stuff must be done, and just before inserting I need to hook in. What is the hook and the operation to do this? I tried the 'validate' operation in hook_user. But after I check against my logic and let the registration proceed, it should not fail due to some other validation. If I let it proceed, no application logic after that should stop the registration (unless the DB engine fails or something of course). If I stop it, I will set a form error so that nothing happens. How can this be accomplished? P.S. I am using Drupal 6.16

    Read the article

  • SQL Server 2008 vs 2005 UDF XML performance problem

    - by user344495
    OK, we have a simple UDF that takes an XML integer list and returns a table: CREATE FUNCTION [dbo].[udfParseXmlListOfInt] ( @ItemListXml XML (dbo.xsdListOfInteger) ) RETURNS TABLE AS RETURN ( --- parses the XML and returns it as an int table --- SELECT ListItems.ID.value('.','INT') AS KeyValue FROM @ItemListXml.nodes('//list/item') AS ListItems(ID) ) In a stored procedure we create a temp table using this UDF: INSERT INTO @JobTable (JobNumber, JobSchedID, JobBatID, StoreID, CustID, CustDivID, BatchStartDate, BatchEndDate, UnavailableFrom) SELECT JOB.JobNumber, JOB.JobSchedID, ISNULL(JOB.JobBatID,0), STO.StoreID, STO.CustID, ISNULL(STO.CustDivID,0), AVL.StartDate, AVL.EndDate, ISNULL(AVL.StartDate, DATEADD(day, -8, GETDATE())) FROM dbo.udfParseXmlListOfInt(@JobNumberList) TMP INNER JOIN dbo.JobSchedule JOB ON (JOB.JobNumber = TMP.KeyValue) INNER JOIN dbo.Store STO ON (STO.StoreID = JOB.StoreID) INNER JOIN dbo.JobSchedEvent EVT ON (EVT.JobSchedID = JOB.JobSchedID AND EVT.IsPrimary = 1) LEFT OUTER JOIN dbo.Availability AVL ON (AVL.AvailTypID = 5 AND AVL.RowID = JOB.JobBatID) ORDER BY JOB.JobSchedID; For a simple list of 10 JobNumbers in SQL 2005 this returns in less than 1 second; in 2008, run against the exact same data, it returns in 7 min. This is on a much faster machine with more memory. Any ideas?
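
    One workaround that often helps when joining directly against an XML-shredding function produces a bad plan - whether it explains the 2005 vs 2008 difference here is an assumption - is to materialize the parsed list first and join against that, so the optimizer gets a plain table with known cardinality:

        DECLARE @JobNumbers TABLE (KeyValue INT PRIMARY KEY);

        INSERT INTO @JobNumbers (KeyValue)
        SELECT KeyValue
        FROM dbo.udfParseXmlListOfInt(@JobNumberList);

        -- then run the rest of the original query against @JobNumbers instead of the UDF
        SELECT JOB.JobNumber, JOB.JobSchedID
        FROM @JobNumbers TMP
        INNER JOIN dbo.JobSchedule JOB ON (JOB.JobNumber = TMP.KeyValue);

    It is also generally cheaper to shred with the specific path '/list/item' rather than the descendant axis '//list/item' inside the UDF.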

    Read the article

  • How should I store Dynamically Changing Data into Server Cache?

    - by Scott
    Hey all, EDIT: Purpose of this website: It's called Utopiapimp.com. It is a third-party utility for a game called utopia-game.com. The site currently has over 12k users and I run the site. The game is fully text based and will always remain that way. Users copy full pages of text from the game and paste the copied information into my site. I run a series of regular expressions against the pasted data and break it down. I then insert anywhere from 5 values to over 30 values into the DB based on that one paste. I then take those values and run queries against them to display the information back in a VERY simple and easy to understand way. The game is team based and each team has 25 users. So each team is a group and each row is ONE user's information. The users can update all 25 rows or just one row at a time. I require storing things in cache because the site is very slow doing over 1,000 queries almost every minute. So here is the deal. Imagine I have an Excel spreadsheet with 100 columns and 5000 rows. Each row has two unique identifiers: one for the row itself and one to group together 25 rows apiece. There are about 10 columns in the row that will almost never change and the other 90 columns will always be changing. We can say some will even change in a matter of seconds, depending on how fast the row is updated. Rows can also be added and deleted from the group, but not from the database. The rows are taken from about 4 queries against the database to show the most recent and updated data from the database. So every time something in the database is updated, I would also like the row to be updated. If a row or a group has not been updated in 12 or so hours, it will be taken out of cache. Once the user calls the group again via the DB queries, they will be placed into cache. The above is what I would like. That is the wish. In reality, I still have all the rows, but the way I store them in cache is currently broken. I store each row in a class and the class is stored in the server cache via a HUGE list. When I go to update/delete/insert items in the list or rows, most of the time it works, but sometimes it throws errors because the cache has changed. I want to be able to lock down the cache more or less like the database throws a lock on a row. I have DateTime stamps to remove things after 12 hours, but this almost always breaks because other users are updating the same 25 rows in the group or just because the cache has changed. This is an example of how I add items to cache; this one shows I only pull the 10 or so columns that very rarely change. This example also removes rows not updated after 12 hours:

        DateTime dt = DateTime.UtcNow;
        if (HttpContext.Current.Cache["GetRows"] != null)
        {
            List<RowIdentifiers> pis = (List<RowIdentifiers>)HttpContext.Current.Cache["GetRows"];
            var ch = (from xx in pis
                      where xx.groupID == groupID
                      where xx.rowID == rowID
                      select xx).ToList();
            if (ch.Count() == 0)
            {
                var ck = GetInGroupNotCached(rowID, groupID, dt); // Pulling the group from the DB
                for (int i = 0; i < ck.Count(); i++)
                    pis.Add(ck[i]);
                pis.RemoveAll((x) => x.updateDateTime < dt.AddHours(-12));
                HttpContext.Current.Cache["GetRows"] = pis;
                return ck;
            }
            else
                return ch;
        }
        else
        {
            var pis = GetInGroupNotCached(rowID, groupID, dt); // Pulling the group from the DB
            HttpContext.Current.Cache["GetRows"] = pis;
            return pis;
        }

    On the last point, I remove items from the cache, so the cache doesn't actually get huge. To re-post the question: what's a better way of doing this? Maybe, how do I put locks on the cache? Can I do better than this? I just want it to stop breaking when removing or adding rows.

    Read the article

  • select from multiple tables with different columns

    - by Qaiser Iftikhar
    Say I've got this SQL schema. Table Job: id, title, type, is_enabled Table JobFileCopy: job_id, from_path, to_path Table JobFileDelete: job_id, file_path Table JobStartProcess: job_id, file_path, arguments, working_directory There are many other tables with varying numbers of columns, and they all have a foreign key job_id which is linked to id in table Job. My questions: Is this the right approach? I don't have a requirement to delete anything at any time; I will mostly need to select and insert. Secondly, what is the best approach to get the list of jobs with relevant details from all the different tables in a single database hit? E.g. I would like to select the top 20 jobs with details; their details can be in any table (depending on the type column in table Job), which I don't know until runtime. Thanks in advance. Regards,
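
    A minimal sketch of one way to pull the top 20 jobs and their details in a single round trip, using the tables from the question and LEFT JOINs so a job still appears even when a given detail table has no row for it (the TOP syntax assumes SQL Server, and the is_enabled filter is an assumption):

        SELECT TOP (20)
               j.id, j.title, j.type,
               fc.from_path, fc.to_path,
               fd.file_path AS delete_file_path,
               sp.file_path AS process_file_path, sp.arguments, sp.working_directory
        FROM Job j
        LEFT JOIN JobFileCopy     fc ON fc.job_id = j.id
        LEFT JOIN JobFileDelete   fd ON fd.job_id = j.id
        LEFT JOIN JobStartProcess sp ON sp.job_id = j.id
        WHERE j.is_enabled = 1
        ORDER BY j.id;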

    Read the article

  • MaximumErrorCount has no effect

    - by Rob Bowman
    Hi, I'm quite new to SSIS - using the 2008 version. I have a job that uses a few data flow tasks. On the third one I'm getting a primary key violation on the last row that it needs to insert, but only sometimes! I'd like to ignore this problem for now and let the job continue. I have set the MaximumErrorCount property to 10 for the DataFlowTask, the SequenceContainer and for the Package, but the task still fails and this causes the package to stop. Could anyone please advise how I can get the package to ignore the error? Thanks Rob.

    Read the article

  • Making Django ignore string literals

    - by James
    UPDATE: It turns out this is a deeper question than I thought at first glance - the issue is that Python is replacing the string literals before they ever get to Django. I will do more investigating and update this if I find a solution. I'm using Django to work with LaTeX templates for report generation, and am running into a lot of problems with the way Django replaces parts of strings. Specifically, I've run into two problems when I try to insert a variable containing LaTeX code. The first was that it would replace HTML characters, such as the less-than symbol, with their HTML codes, which are of course gibberish to a LaTeX interpreter. I fixed this by setting the context to never autoescape, like so: c = Context(inputs) c.autoescape = False However, I still have my second issue, which is that Django replaces string literals with their corresponding characters, so a double backslash becomes \, and \b becomes a backspace. How can I force Django to leave these characters in place, so inputs['variable'] = '{\bf this is code} \\' won't get mangled when I use {{variable}} to reference it in the Django template?

    Read the article

  • CakePHP Form Helper Database Interaction

    - by xtine
    Currently I am developing a CakePHP application that will list various businesses. In the database, there is a table for businesses that lists them like so: id | name | address | city | state | state_id | zip | url The state column holds state abbreviations (for listing purposes) - CA, AK, FL, etc. - and the state_id matches up with the ids in the states table: id | name | state_abbr I have an admin_add.ctp template with form helpers for inserting new businesses. For entering a state for a business I am going to have a pull-down that lists all the states. However, how do I make the database insert so it will know how to add the state abbreviation and state id when I submit the form to add the business?

    Read the article

  • help regarding Perl programs

    - by riya
    Could someone write simple Perl programs for the following scenarios: 1) convert a list from {1,2,3,4,5,7,9,10,11,12,34} to {1-5,7,9-12,34} 2) sort a list of negative numbers 3) insert values into a hash array 4) there is a file with content: C1 c2 c3 c4 r1 r2 r3 r4 - put it into a hash array where keys = {c1,c2,c3,c4} and values = {r1,r2,r3,r4} 5) there are test cases running; each test case runs as a process and has a process ID. The logs are written to a logfile with the process ID appended to each line. Write a program to find out if a test case has passed or failed. The program should keep running while the processes are running and display output.

    Read the article

  • SQL Function for On Balance Volume (Financial Query)

    - by CraigJSte
    I would like to create a function for On Balance Volume (SQL function). This is too complex of a calculation for me to figure out, but here is the outline of the user-defined table function. If someone could help me to fill in the blanks I would appreciate it. Craig CREATE FUNCTION [dbo].[GetStdDev3] (@TKR VARCHAR(10)) RETURNS @results TABLE ( dayno SMALLINT IDENTITY(1,1) PRIMARY KEY , [date] DATETIME , [obv] FLOAT ) AS BEGIN DECLARE @rowcount SMALLINT INSERT @results ([date], [obv]) // CREATE A FUNCTION FOR ON BALANCE VOLUME // On Balance Volume is the Sum of Volume for Total Periods // OBV = 1000 at Period = 0 // OBV = OBV Previous + Previous Volume if Close > Previous Close // OBV = OBV Previous - Previous Volume if Close < Previous Close // OBV = OBV Previous if Close = Previous Close // The actual Value of OBV is not important so to keep the ratio low we reduce the // Total Value of Tickers by 1/10th or 1/100th // For Value of Volume = Volume * .01 if Volume < 999 // For Value of Volume = Volume * .001 If Volume >= 999 FROM Tickers RETURN END This is the Tickers table [dbo].[Tickers]( [ticker] [varchar](10) NULL, [date] [datetime] NULL, [high] [float] NULL, [low] [float] NULL, [open] [float] NULL, [close] [float] NULL, [volume] [float] NULL, [time] [datetime] NULL, [change] [float] NULL )
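
    A sketch of one way to fill in the body, following the rules stated in the outline (base value 1000; previous volume added or subtracted depending on whether the close rose or fell). The volume-scaling step is left out for clarity, and the running total uses a correlated subquery because SQL Server 2005/2008 has no SUM() OVER (ORDER BY ...); this is an illustration, not a tested replacement for the function:

        ;WITH ordered AS
        (
            SELECT [date], [close], [volume],
                   ROW_NUMBER() OVER (ORDER BY [date]) AS rn
            FROM dbo.Tickers
            WHERE ticker = @TKR
        ),
        deltas AS
        (
            SELECT cur.rn, cur.[date],
                   CASE WHEN prev.rn IS NULL            THEN 1000            -- OBV = 1000 at period 0
                        WHEN cur.[close] > prev.[close] THEN  prev.[volume]  -- close rose
                        WHEN cur.[close] < prev.[close] THEN -prev.[volume]  -- close fell
                        ELSE 0                                               -- close unchanged
                   END AS delta
            FROM ordered cur
            LEFT JOIN ordered prev ON prev.rn = cur.rn - 1
        )
        INSERT @results ([date], [obv])
        SELECT d.[date],
               (SELECT SUM(d2.delta) FROM deltas d2 WHERE d2.rn <= d.rn) AS obv
        FROM deltas d
        ORDER BY d.rn;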

    Read the article

  • Why do I have to explicitly cast sometimes for varargs?

    - by Daniel Lew
    I've got a class that uses reflection a lot, so I wrote a method to help out: private <T> T callMethod(String methodName, Class[] parameterTypes, Object[] args) { try { Class c = mVar.getClass(); Method m = c.getMethod(methodName, (Class[]) parameterTypes); return (T) m.invoke(mVar, args); } // Insert exception catching here [...] } This worked well for any method that had parameters; however, I had to explicitly cast parameterTypes to Class[] in order for this to work for methods with no parameters (e.g., callMethod('funName', null, null);). I've been trying to figure out why this is the case. It seems to me that if parameterTypes, when null, had no concept of what type it is (Class[]), then I'd need to cast it for getMethod(). But if that's the case, why is getMethod() able to tell the difference between null and (Class[]) null when the method is invoked?

    Read the article

  • predict location of single-line text from a UITextView

    - by William Jockusch
    Is this possible? Specifically, I have a UITextView, and the text is short enough that it will fit on a single line. I want to predict where it will appear, so that (for example) if I wanted to, I could set up a UILabel that rendered the text in exactly the same location. Once I get that figured out, what I really want to do is pick contentInset and/or contentOffset so that the text of the UITextView and the left-justified UILabel will render in the same location. But I figure the above will let me do that. EDIT: In response to a comment, the fundamental problem I am trying to get around is that UITextField does not let you set the location of the cursor. It appears a lot of people have tried to get around this without success. I need to be able to move the cursor -- inserting/deleting text there is not enough. See also: Control cursor position in UITextField; Insert string at cursor position of UITextField; Moving the cursor to the beginning of UITextField; iOS -- dealing with the inability to set the cursor position in a UITextField

    Read the article

  • How do I make a new div with a specific Id + n?

    - by Noor
    I've got a button, and when it's pushed this code runs: var nrDivs = "1" var neuDiv = document.createElement("div"); neuDiv.setAttribute("Id","theDiv" + nrDivs); neuDiv.innerHTML = "Text"; $('#allDivs').append(neuDiv); newfeed.draw(neuDiv); I know that right now the script creates a new div with the ID: theDiv1. What I'm trying to do is have it create a div with the ID theDiv2 when I click the button a second time. Each time the user presses the button there should be a new div drawn, and I'm trying to target that new div and insert dynamic text. Thanks, Noor

    Read the article

  • How can I improve the below query?

    - by Newbie
    I have the following input.
    INPUT: TableA
        ID  Sentences
        --- ----------
        1   I am a student
        2   Have a nice time guys!
    What I need to do is to extract the words from the sentence(s) and insert each individual word into another table.
    OUTPUT:
        SentenceID  WordOccurance  Word
        ----------  -------------  -----
        1           1              I
        1           2              am
        1           3              a
        1           4              student
        2           1              Have
        2           2              a
        2           3              nice
        2           4              time
        2           5              guys!
    I was able to get the answer by using the below query:
        ;With numCTE As
        ( Select rn = 1
          Union all
          Select rn+1 from numCTE where rn<1000)
        select SentenceID=id,
               WordOccurance=row_number()over(partition by TableA.ID order by rn),
               Word = substring(' '+sentences+' ', rn+1, charindex(' ',' '+sentences+' ', rn+1)-rn-1)
        from TableA
        join numCTE on rn <= len(' '+sentences+' ')
        where substring(' '+sentences+' ', rn,1) = ' '
        order by id, rn
    How can I improve this query of mine? Basically I am looking for a better solution than the one presented. Thanks
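
    One possible improvement, sketched here on the assumption that a persisted numbers (tally) table is acceptable (the name dbo.Numbers is illustrative): the splitting logic stays the same, but the join runs against a real indexed table instead of a 1000-row recursive CTE, which avoids the recursion limit and usually scales better on larger inputs.

        -- one-off setup: a numbers table with enough rows for the longest sentence
        SELECT TOP (8000)
               n = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
        INTO dbo.Numbers
        FROM sys.all_objects a CROSS JOIN sys.all_objects b;

        CREATE UNIQUE CLUSTERED INDEX IX_Numbers_n ON dbo.Numbers (n);

        -- same word-splitting logic as the original query, driven by the numbers table
        SELECT SentenceID    = t.ID,
               WordOccurance = ROW_NUMBER() OVER (PARTITION BY t.ID ORDER BY nm.n),
               Word          = SUBSTRING(' ' + t.Sentences + ' ', nm.n + 1,
                               CHARINDEX(' ', ' ' + t.Sentences + ' ', nm.n + 1) - nm.n - 1)
        FROM TableA t
        JOIN dbo.Numbers nm ON nm.n <= LEN(' ' + t.Sentences + ' ')
        WHERE SUBSTRING(' ' + t.Sentences + ' ', nm.n, 1) = ' '
        ORDER BY t.ID, nm.n;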

    Read the article

  • Write a row to a Google Spreadsheet programmatically in JavaScript

    - by Verber
    I'm trying to find a way to insert a row into a Google spreadsheet dynamically. I have a list of objects that has all the data for every column in a row. I'm just trying to run a for loop and then send the data to the row in the spreadsheet. But the spreadsheet is completely empty, so I don't know if that creates certain problems or not. for(var j=0; j < masterList.length; j++) { //tried this one but it didn't pan out sheet.appendRow([ masterList[j].Date, masterList[j].Name, masterList[j].Bugs, masterList[j].Enhancements, masterList[j].Epic, masterList[j].DevOps, masterList[j].High ]); }

    Read the article

  • Log Shipped but Won't Update

    - by MooCow
    I'm currently taking the MS SQL 2K5 Admin course at a local college and ran into a problem with the Log Shipping part. My setup is the following: Windows 7 x64; SQL 2005 SP3; 2 SQL Server instances on the same machine. Log Shipping settings: performed a full and then a log backup of the primary; manually restored on the secondary in STANDBY mode; inserted a new record into the table; set up Log Shipping on the primary using a SQL Authentication login to connect to the secondary; set up timers and the copy destination on the secondary; monitoring instance not being used. I set up a shared folder for WORKGROUP so both instances on the machine can read & write to it. I can see transaction logs generated and copied as defined by the Transaction Shipping wizard. However, the specified table on the secondary instance is not updating.

    Read the article

  • Why will a single SQL delete statement cause a deadlock?

    - by George2
    Hello everyone, I am using SQL Server 2008 Enterprise. I am wondering why even a single delete statement of this stored procedure will cause a deadlock if executed by multiple threads at the same time. For the delete statement, Param1 is a column of table FooTable; Param1 is a foreign key to another table (it refers to a primary key clustered index column of that other table). There is no index on Param1 itself in table FooTable. FooTable has another column which is used as the clustered primary key, but not the Param1 column. create PROCEDURE [dbo].[FooProc] ( @Param1 int ,@Param2 int ,@Param3 int ) AS DELETE FooTable WHERE Param1 = @Param1 INSERT INTO FooTable ( Param1 ,Param2 ,Param3 ) VALUES ( @Param1 ,@Param2 ,@Param3 ) DECLARE @ID bigint SET @ID = ISNULL(@@Identity,-1) IF @ID > 0 BEGIN SELECT IdentityStr FROM FooTable WHERE ID = @ID END thanks in advance, George
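
    One likely contributor, offered as a sketch rather than a confirmed diagnosis: with no index on Param1, each DELETE has to scan the table and lock far more rows than it deletes, so two concurrent executions can easily block each other in opposite order and deadlock. A supporting index lets the DELETE seek on just the rows it touches; switching from @@IDENTITY to SCOPE_IDENTITY() is a separate, generally recommended cleanup.

        -- let DELETE FooTable WHERE Param1 = @Param1 seek instead of scan
        CREATE NONCLUSTERED INDEX IX_FooTable_Param1 ON dbo.FooTable (Param1);

        -- inside the procedure: scope-safe identity capture
        DECLARE @ID bigint
        SET @ID = ISNULL(SCOPE_IDENTITY(), -1)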

    Read the article
