Search Results

Search found 31902 results on 1277 pages for 'sql backup'.

Page 795/1277 | < Previous Page | 791 792 793 794 795 796 797 798 799 800 801 802  | Next Page >

  • Error 2020: Got packet bigger than 'max_allowed_packet' bytes when dumping table

    - by Imagineer
    I'm getting the above-mentioned error when backing up with ZRM, which uses mysqldump for the backup:

        mysqldump --opt --extended-insert --single-transaction --create-options --default-character-set=utf8 --user=" " -p --all-databases "/nfs/backup/mysql01/dailyrun/20091216043001/backup.sql"
        mysqldump: Error 2020: Got packet bigger than 'max_allowed_packet' bytes when dumping table TICKET_ATTACHMENT at row: 2286

    I have increased 'max_allowed_packet' to 1G in /etc/my.cnf, which is the server-side setting, and on the client side I set it by running this command:

        mysql -u -p --max_allowed_packet=1G

    I have verified that the client-side and server-side values are the same. This checks the client-side value, per this forum posting: http://forums.mysql.com/read.php?35,75794,261640

        mysql> SELECT @@MAX_ALLOWED_PACKET;
        +----------------------+
        | @@MAX_ALLOWED_PACKET |
        +----------------------+
        |           1073741824 |
        +----------------------+
        1 row in set (0.00 sec)

    And this checks the server-side value:

        mysql> SHOW VARIABLES;
        | max_allowed_packet | 1073741824 |

    I have run out of ideas; I have tried searching Experts Exchange and Google for solutions, but so far none has worked.

    Reference: http://dev.mysql.com/doc/refman/5.1/en/packet-too-large.html

    Anyone please advise, thank you.
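
    As a side note on scope, a minimal SQL sketch (not from the question itself) of how the global and session values of this variable can be set and compared on the server; it assumes an account with privileges to change global variables:

        -- set the global value to 1 GiB (new client sessions pick this up)
        SET GLOBAL max_allowed_packet = 1073741824;

        -- compare what the current session sees against the global value
        SELECT @@SESSION.max_allowed_packet AS session_value,
               @@GLOBAL.max_allowed_packet  AS global_value;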

    Read the article

  • NHibernate with nothing but stored procedures

    - by ChrisB2010
    I'd like to have NHibernate call a stored procedure when ISession.Get is called to fetch an entity by its key, instead of using dynamic SQL. We have been using NHibernate and allowing it to generate our SQL for queries and inserts/updates/deletes, but now we may have to deploy our application to an environment that requires us to use stored procedures for all database access.

    We can use sql-insert, sql-update, and sql-delete in our .hbm.xml mapping files for inserts/updates/deletes. Our HQL and criteria queries will have to be replaced with stored procedure calls. However, I have not figured out how to force NHibernate to use a custom stored procedure to fetch an entity by its key. I still want to be able to call ISession.Get, as in:

        using (ISession session = MySessionFactory.OpenSession())
        {
            return session.Get<Customer>(customerId);
        }

    and also lazy load objects, but I want NHibernate to call my "GetCustomerById" stored procedure instead of generating the dynamic SQL. Can this be done? Perhaps NHibernate is no longer a fit given this new environment we must support.

    Read the article

  • Prevent Win7 boot loader from taking over the WinXP boot loader

    - by Chris
    My setup: 1 physical hard drive (500 GB, divided equally into 2 partitions)
        - Windows XP partition (current OS)
        - Empty partition where I will be installing Windows 7

    My question is: how do I prevent the Windows 7 boot loader from taking over my Windows XP boot loader when installing the new OS? The reason I ask is that I already have a ghosted backup of my Windows XP partition. If I ever need to restore the XP partition from that backup, would it not overwrite the Windows 7 boot loader that was placed on the XP partition with the one from the backup, making Windows 7 unable to boot?

    Also, what would happen if I decided to delete the Windows XP partition altogether somewhere down the road, and along with it the Windows 7 boot loader that was placed there? Wouldn't that cause the system not to boot at all?

    To avoid these issues, I simply want to make sure that BOTH the Windows 7 and Windows XP boot loaders are available on their respective partitions and do not interfere with each other in any way. Is this possible?

    Thanks, Chris

    Read the article

  • How do I associate Parameters to Command objects in ADO with VBScript?

    - by Krashman5k
    I have been working on an ADO VBScript that needs to accept parameters and incorporate those parameters into the query string that gets passed to the database. I keep getting errors when the recordset object attempts to open. If I pass a query without parameters, the recordset opens and I can work with the data.

    When I run the script through a debugger, the command object does not show a value for the parameter object. It seems to me that I am missing something that associates the Command object with the Parameter object, but I do not know what. Here is a bit of the VBScript code:

        ...
        'Open text file to collect SQL query string'
        Set fso = CreateObject("Scripting.FileSystemObject")
        fileName = "C:\SQLFUN\Limits_ADO.sql"
        Set tso = fso.OpenTextFile(fileName, FORREADING)
        SQL = tso.ReadAll

        'Create ADO instance'
        connString = "DRIVER={SQL Server};SERVER=myserver;UID=MyName;PWD=notapassword; Database=favoriteDB"
        Set connection = CreateObject("ADODB.Connection")
        Set cmd = CreateObject("ADODB.Command")
        connection.Open connString
        cmd.ActiveConnection = connection
        cmd.CommandText = SQL
        cmd.CommandType = adCmdText

        Set paramTotals = cmd.CreateParameter
        With paramTotals
            .value = "tot%"
            .Name = "Param1"
        End With

        'The error occurs on the next line'
        Set recordset = cmd.Execute

        If recordset.EOF Then
            WScript.Echo "No Data Returned"
        Else
            Do Until recordset.EOF
                WScript.Echo recordset.Fields.Item(0) ' & vbTab & recordset.Fields.Item(1)
                recordset.MoveNext
            Loop
        End If

    The SQL string that I use is fairly standard, except that I want to pass a parameter to it. It is something like this:

        SELECT column1 FROM table1 WHERE column1 IS LIKE ?

    I understand that ADO should replace the "?" with the parameter value I assign in the script. The problem I am seeing is that the Parameter object shows the correct value, but the command object's parameter field is null according to my debugger.

    Read the article

  • Simultaneous read/write to RAID array slows server to a crawl

    - by Jeff Leyser
    Fairly beefy NFS/SMB server (32 GB RAM, 2 quad-core Xeons) with an LSI MegaRAID 8888ELP controlling 12 drives configured into 3 different arrays. Five 2 TB drives are grouped into a RAID 6 array. As expected, write performance to the array is slow. However, sustained, simultaneous read/write to the array (whether through NFS or done locally) seems to practically block any other access to anything else on the controller.

    For example, if I do:

        cp /home/joe/BigFile /home/joe/BigFileCopy

    where BigFile is 20 GB, then even a simple ls /home/jane will take many tens of seconds to complete. In addition, an ls /backup will also take many tens of seconds, even though /backup is a different array on the same controller. As soon as the cp is done, everything is back to normal.

        cp /home/joe/BigFile /backup/BigFile

    does not exhibit this behavior. It's only when doing read/write to the same array.

    Read the article

  • How do you localize/internationalize an MVC Controller when using a SQL based localization provider?

    - by EBarr
    Hopefully this isn't too silly a question. In MVC there appears to be plenty of localization support in the views. Once I get to the controller, however, it becomes murky. Using meta:resourcekey="blah" is out, same with <%$ Resources:PageTitle.Text %>.

    "ASP.NET MVC - Localization Helpers" suggested extensions for the Html helper classes like Resource(this Controller controller, string expression, params object[] args). Similarly, "Localize your MVC with ease" suggested a slightly different extension like Localize(this System.Web.UI.UserControl control, string resourceKey, params object[] args). None of these approaches works while in a controller.

    I put together the function below, and I'm using the controller's full class name as my VirtualPath. But I'm new to MVC and assume there's a better way.

        public static string Localize(System.Type theType, string resourceKey, params object[] args)
        {
            string resource = (HttpContext.GetLocalResourceObject(theType.FullName, resourceKey) ?? string.Empty).ToString();
            return mergeTokens(resource, args);
        }

    Thoughts? Comments?

    Read the article

  • How do I make more available space on a Time Machine hard disk?

    - by Daryl Spitzer
    I upgraded my MacBook Pro to Snow Leopard and made some other changes that have caused my next Time Machine backup to be quite large. Previous to the upgrade my backup drive had filled up, so Time Machine was deleting old backups to make room for new ones.

    When Time Machine started the first backup after the upgrade, it displayed a message that it was freeing up space, but it wasn't able to free up enough. (The disk has 320 GB capacity.)

    How can I free up more space on the disk (without reformatting or deleting all the existing backups)? I don't want to recklessly delete files and take the risk of confusing Time Machine.

    Read the article

  • Which database to use for a CMS project in ASP.NET - SQLite or SQL Server Compact?

    - by srsstr
    Hello there. I am working on a CMS project using ASP.NET 3.5 / Visual Studio 2008. This is the first week of the project and I am working on the design of the system right now. Needless to say, this is my first project of this scale and I have no idea what I am doing. The requirements ask for a light but functional CMS, one which is easy to deploy. So the question is: which database should I use in this scenario, SQL Server Compact (SQL CE) or SQLite? Please help!

    Read the article

  • Using multiple wifi connections simultaneously on Windows

    - by Salman A
    My office PC has one wireless network card, and there are three available wifi connections: primary, backup, and backup of a backup (grin). Is it possible for me to use all three simultaneously? If this results in an increase in bandwidth, that's well and good, but the primary reason is that every now and then one of the networks fails, and I have to switch back and forth between the available networks by disconnecting, viewing available networks, and connecting to the next one hoping it's running. Do I need more than one network card, or some software, e.g. a proxy?

    Read the article

  • Use API or SQL to detect new Support Tickets?

    - by David Powers
    I currently work for a company that uses Kayako for their support system. They sell an extra program called Insta Alert that plays a sound when a new ticket is submitted. I use WHMCS for my own company, and I would like to develop something that works with it and does the same thing. Here is the WHMCS API: http://wiki.whmcs.com/API:Functions

    I am wondering whether it would make more sense, from a remote C++ application, to use the API or to just check the MySQL database for new tickets? This is not really something I'm overly familiar with (I usually make mods), but it doesn't seem overly difficult. I just want some assistance in choosing the best approach.
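
    For illustration only, a minimal SQL sketch of what the "poll the database" approach might look like; the table and column names here (tickets, id, status) are hypothetical placeholders, not WHMCS's actual schema, which would need to be checked before use:

        -- Fetch any tickets created since the last one the application has seen,
        -- using a monotonically increasing primary key as the high-water mark.
        SELECT id, status
        FROM tickets
        WHERE id > 12345          -- last ticket id already alerted on
        ORDER BY id;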

    Read the article

  • SQL query: Delete an entry which is not present in a join table?

    - by Mestika
    Hi, I'm going to delete all users who have no subscription, but I seem to run into problems each time I try to detect those users. My schemas look like this:

        Users = {userid, name}
        Subscriptionoffering = {userid, subscriptionname}

    Now, what I want to do is delete all users in the Users table that have a count of zero in the Subscriptionoffering table. Or, said in other words: all users whose userid is not present in the Subscriptionoffering table. I've tried different queries but with no result. I've tried to say WHERE user.userid <> subscriptionoffering.userid, but that doesn't seem to work. Does anyone know how to create the correct query?

    Thanks, Mestika
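
    A minimal sketch of the kind of query being asked for, assuming only the two schemas listed above (standard SQL; exact DELETE alias rules vary slightly between database engines):

        -- Delete every user that has no matching row in Subscriptionoffering
        DELETE FROM Users
        WHERE userid NOT IN (SELECT userid FROM Subscriptionoffering);

        -- Equivalent NOT EXISTS form, which behaves more predictably if
        -- Subscriptionoffering.userid can ever be NULL
        DELETE FROM Users
        WHERE NOT EXISTS (
            SELECT 1
            FROM Subscriptionoffering s
            WHERE s.userid = Users.userid
        );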

    Read the article

  • Serialize a C# class to XML, store the XML as a string in SQL Server, and then restore the class later

    - by BrianK
    I want to serialize a class to XML and store that in a field in a database. I can serialize with this:

        StringWriter sw = new StringWriter();
        XmlSerializer xmlser = new XmlSerializer(typeof(MyClass));
        xmlser.Serialize(sw, myClassVariable);
        string s = sw.ToString();
        sw.Close();

    That works, but it has the namespaces in it:

        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"

    Will these slow down the deserialization because it will go out to those and verify the XML? I got rid of the namespaces by creating a blank XmlSerializerNamespaces and using that to serialize, but then the XML still had namespaces around integer variables:

        <anyType xmlns:q1="http://www.w3.org/2001/XMLSchema" d3p1:type="q1:int" xmlns:d3p1="http://www.w3.org/2001/XMLSchema-instance">
            3
        </anyType>

    My questions are: Is it necessary to have the namespaces for deserialization, and if not, how do I get rid of them? How do I tell it the fields are ints so it doesn't put in "anyType"?

    Thanks, Brian

    Read the article

  • Variable declared with the VARIABLE keyword in SQL*Plus (Oracle 9i)?

    - by Vineet
    I am trying to declare g_num as a NUMBER data type with a size; it gives an error, but in the case of VARCHAR2 or CHAR it does not.

        variable g_name varchar2(5);   -- correct: accepts a size for VARCHAR2
        variable g_num number(23);     -- gives an error

        " VAR[IABLE] [ <variable> [ NUMBER | CHAR | CHAR (n [CHAR|BYTE]) | VARCHAR2 (n [CHAR|BYTE]) | NCHAR | NCHAR (n) | NVARCHAR2 (n) | CLOB | NCLOB | REFCURSOR ] ]"

    Please suggest!
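
    As the quoted usage message itself indicates, SQL*Plus bind variables of type NUMBER take no length or precision, while VARCHAR2 does. A minimal sketch of declarations the VARIABLE command accepts:

        VARIABLE g_name VARCHAR2(5)
        VARIABLE g_num  NUMBER

        -- assign and inspect the NUMBER bind variable
        EXEC :g_num := 42
        PRINT g_num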

    Read the article

  • How to combine a Distance and Keyword SQL query?

    - by Jason
    Hi folks, I have tables in my database called "points" and "category". A user will input info into both a location input and a keyword input text box. Then I want to find points in my table where the keyword matches either the "title" field in the points table or the "category", but which are within a certain distance from the user's location. I want to order the results by distance. Here are the two queries, which both work independently:

        $mysql = "SELECT *,
            ( 3959 * acos( cos( radians('$search_lat') ) * cos( radians( lat ) ) * cos( radians( longi ) - radians('$search_lng') ) + sin( radians('$search_lat') ) * sin( radians( lat ) ) ) ) AS distance
            FROM points
            HAVING distance < '$radius'";

        $mysql2 = "SELECT * FROM `points`
            LEFT JOIN category USING ( category_id )
            WHERE (point_title LIKE '%$esc_catsearch%' OR category.title LIKE '%$esc_catsearch%')";

    Here is what I tried:

        $sql_search = sprintf("SELECT *,point_id FROM points WHERE point_title LIKE '%%%s%%'
            UNION
            SELECT *, ( 3959 * acos( cos( radians('%s') ) * cos( radians( lat ) ) * cos( radians( longi ) - radians('%s') ) + sin( radians('%s') ) * sin( radians( lat ) ) ) ) AS distance
            FROM points
            HAVING distance < '%s'
            ORDER BY distance
            LIMIT %d , %d",
            $esc_catsearch,
            mysql_real_escape_string($search_lat),
            mysql_real_escape_string($search_lng),
            mysql_real_escape_string($search_lat),
            mysql_real_escape_string($radius),
            $offset, $rowsPerPage);

    But it tells me there is no known column "distance". If I remove the ORDER BY phrase then it works, but I'm still not sure this is giving me the results I want. I also tried the query the other way around, with the distance search first, but that seems to ignore my keyword. Any thoughts would be much appreciated!
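
    For what it's worth, a minimal sketch of one way to combine the keyword filter and the distance calculation into a single query rather than a UNION, reusing the same PHP variables as above (untested against the asker's schema; note that MySQL accepts a select alias such as distance in HAVING and ORDER BY, but not in WHERE):

        SELECT points.*,
               ( 3959 * acos( cos( radians('$search_lat') ) * cos( radians(lat) )
                            * cos( radians(longi) - radians('$search_lng') )
                            + sin( radians('$search_lat') ) * sin( radians(lat) ) ) ) AS distance
        FROM points
        LEFT JOIN category USING (category_id)
        WHERE point_title LIKE '%$esc_catsearch%'
           OR category.title LIKE '%$esc_catsearch%'
        HAVING distance < '$radius'
        ORDER BY distance
        LIMIT $offset, $rowsPerPage;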

    Read the article

  • Lots of time spent in the waits 'SQL*Net message from client' and 'wait for unread message on broadcast channel'

    - by Shravan
    My application, which wraps around Oracle Data Pump's executables IMPDP and EXPDP, takes random amounts of time for the same work. On further investigation, I see it waiting for, again, random amounts of time on the event 'wait for unread message on broadcast channel'. This makes the application take anywhere between 10 minutes and over an hour for the same work. I fail to understand whether this has something to do with the way my application uses these executables, or whether it has something to do with load on my server, or something totally alien to me.

    Read the article

  • SQL, PHP: want to get the columns of a table; INFORMATION_SCHEMA gives access denied --> alternative?

    - by matthy
    Hi, what I am trying to do is get all the columns of a table (the table can be empty). An example of what I did before:

        SELECT COLUMN_NAME
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE table_name = 'aTable' AND table_schema = 'theDatabase'

    It works perfectly on localhost; however, on my provider it gives:

        #1142 - SELECT command denied to user 'username'@'localhost' for table 'COLUMNS'

    Is there an alternative that doesn't use INFORMATION_SCHEMA?
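
    One possible alternative in MySQL, sketched below on the assumption that the account has at least some privilege on the table itself: SHOW COLUMNS (or its DESCRIBE shorthand) returns the column list without querying INFORMATION_SCHEMA directly.

        SHOW COLUMNS FROM aTable FROM theDatabase;

        -- shorthand form
        DESCRIBE theDatabase.aTable;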

    Read the article

  • Time Capsule on Windows 7

    - by Kiva
    Hi guys, I have a Time Capsule to back up my MacBook Pro, and everything works fine with it. Now, my girlfriend has a PC running Windows 7. She wants to back up her PC to the Time Capsule with Cobian Backup, but her PC doesn't see the Time Capsule, so it's impossible to connect to it. The Time Capsule is connected to my ADSL box over wifi, and the Mac and the PC are both connected to the box over wifi. Why doesn't Windows see the Time Capsule? I installed Bonjour on the PC, but nothing worked. Thanks for your help.

    Read the article

  • How do I do the SQL equivalent of "DISTINCT" in CouchDB?

    - by Blaine LaFreniere
    I have a bunch of MP3 metadata in CouchDB. I want to return every album that is in the MP3 metadata, but no duplicates. A typical document looks like this:

        {
            "_id": "005e16a055ba78589695c583fbcdf7e26064df98",
            "_rev": "2-87aa12c52ee0a406084b09eca6116804",
            "name": "Fifty-Fifty Clown",
            "number": 15,
            "artist": "Cocteau Twins",
            "bitrate": 320,
            "album": "Stars and Topsoil: A Collection (1982-1990)",
            "path": "Cocteau Twins/Stars and Topsoil: A Collection (1982-1990)/15 - Fifty-Fifty Clown.mp3",
            "year": 0,
            "genre": "Shoegaze"
        }

    Read the article

  • How to use OUTPUT statement inside a SQL trigger?

    - by Jeff Meatball Yang
    From MSDN:

        If a statement that includes an OUTPUT clause is used inside the body of a trigger, table aliases must be used to reference the trigger inserted and deleted tables to avoid duplicating column references with the INSERTED and DELETED tables associated with OUTPUT.

    Can someone give me an example of this? I'm not sure if this is correct:

        create trigger MyInterestingTrigger on TransactionalTable123
        AFTER insert, update, delete
        AS
        declare @batchId table(batchId int)

        insert SomeBatch (batchDate, employeeId)
        output SomeBatch.INSERTED.batchId into @batchId

        insert HistoryTable
        select b.batchId, i.col1, i.col2, i.col3
        from TransactionalTable123.inserted i
        cross join @batchId b
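
    For illustration, a minimal sketch of what that MSDN note describes, using hypothetical tables (dbo.Orders, plus an audit table dbo.OrderAudit with an identity column auditId); the point is that the trigger's own inserted table gets an alias (trg_ins) so its columns cannot collide with the INSERTED prefix that belongs to the OUTPUT clause:

        CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
        AFTER INSERT
        AS
        BEGIN
            DECLARE @auditIds TABLE (auditId int);

            -- This statement has its own OUTPUT clause, so the trigger's
            -- inserted table is referenced through the alias trg_ins rather
            -- than by the bare name inserted.
            INSERT INTO dbo.OrderAudit (orderId, auditDate)
            OUTPUT INSERTED.auditId INTO @auditIds
            SELECT trg_ins.orderId, GETDATE()
            FROM inserted AS trg_ins;
        END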

    Read the article

  • ESXi ftpput fails: syntax problem

    - by Datapimp23
    I'm trying to ftpput my virtual machine directories to our NAS, which doesn't support NFS, only FTP and Samba. So I'm in the ESXi console and enter the following command:

        ftpput ipaddress /vmfs/volumes/4a1157e1-be81171a-1b39-001d09080124/VMNAME /Backup

    /Backup is a public share on the NAS; I can access it through any FTP client. After I hit enter I get the following:

        ftpput: can't open 'Backup': No such file or directory

    I'm kind of in the dark here. Any suggestions?

    Read the article

< Previous Page | 791 792 793 794 795 796 797 798 799 800 801 802  | Next Page >