Search Results

Search found 5779 results on 232 pages for 'backup restoration'.


  • What mail storage should I choose for our web application: IMAP, key-value store, RDBMS, ...

    - by tvrtko
    I have to store e-mail messages for use with our application. I have "metadata" for all messages inside a relational database, but I don't feel comfortable keeping the message content (gigabytes and terabytes of email data) inside a database. I'm currently using IMAP as the storage, but I have my doubts about whether I chose correctly. First of all, there is the problem of UIDVALIDITY and how to keep a permanent reference to a message inside IMAP. Second, I'm not sure this is the most robust solution in terms of backup/restore strategies, corruption of the store, replication, and so on. The positive side is that I can query IMAP using the headers, because the data is mostly indexed. I don't know whether key-value stores (Cassandra, Tokyo Cabinet, Redis) are a better approach: how do they handle storing anything from 1 KB to 50 MB of data, how do they prevent corruption, and when corruption or a device failure happens, how can I repair the store?

    Read the article

  • svn working copy spanning 2 physical drives?

    - by ronenosity
    I have a large repository hosted on DreamHost that I back up daily on a remote machine, using a Windows scheduled task which updates a working copy located on an external 300 GB USB drive connected to that machine. The 300 GB drive is nearly full, with only 26 GB of free space remaining. Recently I added a second external 1 TB USB drive to increase storage capacity. I would like to ask: what is the best way to use the new 1 TB drive? Is it possible to span the working copy across both drives (e.g. somehow create a 1.3 TB volume)? If I copied the working copy from the 300 GB drive to the 1 TB drive, would svn continue from there, or do I need to retarget the update-working-copy script to the new 1 TB drive and start downloading everything again (the least desired option)? Thanks.
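
    Spanning one working copy across two drives isn't something Subversion does by itself; the operating system would have to present the two disks as a single volume (e.g. a spanned volume in Windows Disk Management). Copying the checkout is simpler: a working copy carries its .svn metadata with it, so after a copy svn can continue from the copied revision instead of downloading everything again. A minimal sketch on a unix-like shell (on Windows the copy step would be robocopy or a plain Explorer copy); the mount points below are hypothetical:

        # Copy the whole checkout, .svn metadata included (hypothetical paths).
        cp -a /mnt/usb300/working-copy /mnt/usb1tb/working-copy
        cd /mnt/usb1tb/working-copy
        svn cleanup    # clear any stale locks left over from the copy
        svn update     # continues from the copied revision, no full re-download

    After that, the scheduled task's update script just needs to point at the new path.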

    Read the article

  • Is Git suitable for one developer without a server?

    - by Shawn Mclean
    I am a single developer without another computer to back up my projects on. I'm looking into source control and I came across Git, but all the setup tutorials are targeted at an external server. I used to use SourceGear Vault, but seeing that Git is getting a lot of attention, I might as well familiarize myself with it. I do not always have internet access. Is Git suitable for me? Can I be pointed in the right direction to set it up? Visual Studio 2008, Windows 7.
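
    Git works fine without any server at all: the repository lives in a .git folder inside the project, and a bare repository on a second disk or USB drive can stand in as the backup "remote". A minimal sketch using Git Bash/msysgit on Windows; the project path and drive letter are hypothetical:

        cd /c/projects/myapp                        # hypothetical project folder
        git init                                    # one-time: turn the folder into a repository
        git add . && git commit -m "Initial import"

        git init --bare /e/backup/myapp.git         # bare repo on an external drive acts as the "server"
        git remote add backup /e/backup/myapp.git
        git push backup master                      # repeat (or schedule) whenever you want a backup

    None of this needs internet access; for Visual Studio 2008 integration you would look at a separate tool such as Git Extensions.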

    Read the article

  • xcodeproj merge fails when adding new group

    - by user1473113
    I'm currently using Xcode with Git, and I'm running into trouble when merging my .xcodeproj. Developer 1 creates a new group in the Xcode file tree, then commits and pushes. Developer 2, on another computer, does the same with a different group name, commits, and pulls (with a merge). Developer 2's .xcodeproj then becomes unreadable by Xcode. But when I create a new file or just drag and drop files from the Finder into the repository, the merge succeeds. Has anyone experienced this kind of trouble? I'm using this in .gitattributes:

        *.pbxproj -crlf -diff merge=union
        # Better to treat them as binary files.
        *.pbxuser -crlf -diff -merge
        *.xib -crlf -diff -merge

    and in my .gitignore:

        # Mac OS X
        *.DS_Store
        *~

        # Xcode
        *.mode1v3
        *.mode2v3
        *.perspectivev3
        *.xcuserstate
        project.xcworkspace/
        xcuserdata/
        *.xcodeproj/*
        !*.xcodeproj/project.pbxproj
        !*.xcodeproj/*.pbxuser

        # Generated files
        *.o
        *.pyc
        *.hi

        # Python modules
        MANIFEST
        dist/
        build/

        # Backup files
        *~.nib
        \#*#
        .#*

    Read the article

  • How to get the current date and time on the command line

    - by Ieyasu Sawada
    I am using mysqldump to back up a MySQL database. Now I just need to use the current date and time as the file name for the generated SQL file. How do I do that, when my current command looks like this:

        mysqldump -u root -p --add-drop-table --create-options --password= onstor > c:\sql.sql

    I also found this code on this site, but I do not know how to incorporate it into my command:

        @echo off
        For /f "tokens=2-4 delims=/ " %%a in ('date /t') do (set mydate=%%c-%%a-%%b)
        For /f "tokens=1-2 delims=/:" %%a in ('time /t') do (set mytime=%%a%%b)
        echo %mydate%_%mytime%

    Please help, thanks :)
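
    For what it's worth, a sketch of the same idea on a unix-like shell (Cygwin, Git Bash, Linux), where $(date ...) builds the timestamp inline; in a Windows batch file the %mydate% and %mytime% variables produced by the snippet above would be appended to the output file name in the same way. The backup directory is a hypothetical placeholder:

        # /c/backups is a hypothetical target directory.
        mysqldump -u root -p --add-drop-table --create-options --password= onstor \
          > "/c/backups/onstor_$(date +%Y-%m-%d_%H%M%S).sql"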

    Read the article

  • Need help optimizing an NHibernate criteria query that uses Restrictions.In(..)

    - by Chris F
    I'm trying to figure out whether there's a way I can do the following strictly using Criteria and DetachedCriteria, via a subquery or some other way that is more optimal. NameGuidDto is nothing more than a lightweight object that has string and Guid properties.

        public IList<NameGuidDto> GetByManager(Employee manager)
        {
            // First, grab all of the Customers where the employee is a backup manager.
            // Access customers that are primarily managed via manager.ManagedCustomers.
            // I need this list to pass to Restrictions.In(..) below, but can I do it better?
            Guid[] customerIds = new Guid[manager.BackedCustomers.Count];
            int count = 0;
            foreach (Customer customer in manager.BackedCustomers)
            {
                customerIds[count++] = customer.Id;
            }

            ICriteria criteria = Session.CreateCriteria(typeof(Customer))
                .Add(Restrictions.Disjunction()
                    .Add(Restrictions.Eq("Manager", manager))
                    .Add(Restrictions.In("Id", customerIds)))
                .SetProjection(Projections.ProjectionList()
                    .Add(Projections.Property("Name"), "Name")
                    .Add(Projections.Property("Id"), "Guid"));

            // Transform results to NameGuidDto
            criteria.SetResultTransformer(Transformers.AliasToBean(typeof(NameGuidDto)));
            return criteria.List<NameGuidDto>();
        }

    Read the article

  • Accidental deletion of classes from Xcode 3.2.5

    - by Alok Srivastava
    My Classes folder was accidentally deleted, along with its references, from my Xcode project. I tried to recover it from the Trash, but it wasn't there. However, I used SVN for backup, but after checking out the whole project, when I try to run it I get this error:

        2012-06-05 09:46:59.651 Lisnx[527:207] Unknown class LisnxAppDelegate in Interface Builder file.
        2012-06-05 09:46:59.652 Lisnx[527:207] Unknown class LisnxViewController in Interface Builder file.
        2012-06-05 09:46:59.656 Lisnx[527:207] *** Terminating app due to uncaught exception 'NSUnknownKeyException', reason: '[ setValue:forUndefinedKey:]: this class is not key value coding-compliant for the key viewController.'

    Read the article

  • Get the newest file from directory structure year/month/date/time

    - by Radek
    I store backups of databases in a directory structure year/month/day/time/backup_name; an example would be:

        basics_mini/2012/11/05/012232/RATIONAL.0.db2inst1.NODE0000.20110505004037.001
        basics_mini/2012/11/06/012251/RATIONAL.0.db2inst1.NODE0000.20110505003930.001

    Note that the timestamp in the backup file name cannot be used: before the automated testing starts, the server time is set to 5.5.2011. So the question is how I can get the latest file if I pass the "base directory" (basics_mini) to some function that I am going to write. My thought is to list the base directory and sort by time to get the year, then do the same for month, day and time. I wonder if there is an "easier" solution to this in PHP.
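
    Because every path component in this layout (year/month/day/time) is zero-padded, a plain lexical sort of the paths is already chronological, so the newest backup is simply the last path in sorted order. A shell sketch of that idea; in PHP the same approach would be to collect the paths recursively, sort() them, and take the last element:

        # Backup files sit exactly five levels below the base directory in this layout.
        find basics_mini -mindepth 5 -maxdepth 5 -type f | sort | tail -n 1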

    Read the article

  • How to use multiple databases in a PHP web application?

    - by Harish
    I am making a PHP web application in which I am using MySQL as the database server. I want to make a backup of some tables from one database into another database (with those tables in it). I have created two different connections, but the target table is not updated.

        $dbcon1 = mysql_connect(DB_SERVER,DB_USER,DB_PASSWORD) or die(mysql_error());
        $dbase1 = mysql_select_db(TEMP_DB_NAME,$dbcon)or die(mysql_error());
        $query1=mysql_query("SELECT * FROM emp");
        while($row = mysql_fetch_array($query1, MYSQL_NUM))
        {
            $dbcon2 = mysql_connect(DB_SERVER,DB_USER,DB_PASSWORD) or die(mysql_error());
            $dbase2 = mysql_select_db(TEMP_DB_NAME2,$dbcon)or die(mysql_error());
            mysql_query("INSERT INTO backup_emp VALUES(null,'$row[1]',$row[2])");
            mysql_close($dbcon2);
        }

    The code above takes the data from emp in the first database and inserts it into the backup_emp table of the other database. The code is not working properly; is there any other way of doing this? Please help.
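
    Two things stand out in the snippet: mysql_select_db() is passed $dbcon, which is never defined ($dbcon1 and $dbcon2 are), and a new connection is opened for every row. If both databases live on the same MySQL server and the user has rights on both, a single cross-database INSERT ... SELECT avoids the loop entirely. A sketch from the shell, with temp_db1/temp_db2 standing in for the TEMP_DB_NAME/TEMP_DB_NAME2 constants and the two tables assumed to have the same column layout (otherwise list the columns explicitly):

        # Database names and credentials below are hypothetical placeholders.
        mysql -u "$DB_USER" -p"$DB_PASSWORD" -h "$DB_SERVER" \
          -e "INSERT INTO temp_db2.backup_emp SELECT * FROM temp_db1.emp;"

    The same statement can also be issued from PHP through a single mysql_query() call on one connection.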

    Read the article

  • What about the SQL transaction log?

    - by Michel
    Hi, I always thought that the SQL transaction log keeps track of all the transactions done in the database, so that it can help recover the database file in case of an unexpected power failure or something like that. So then, in normal usage, once the data is committed and written to disk, the log would be cleared, because all the data is nice and safe in the .mdf file. Seeing the .ldf file grow, and after some reading, I understand that this is not the case and that it will keep growing until you shrink the log; only at that point are all the committed transactions cleared and the log file shrunk. I found some stored procedures that should do this, but I also found the claim that you first have to back up the database? That last step doesn't make sense to me, so can anyone tell me whether that is correct and, if so, why?

    Read the article

  • How to rebase one Git repository onto another one?

    - by kroimon
    Hi there! I had one Git repository (A), which contained the development of a project up to a certain point. Then I lost the USB stick this repo A was on. Luckily I had a backup of the latest commit, so I could later create a new repository (B) into which I imported the latest project state and continued development. Now I have recovered that lost USB stick, so I have two Git repositories. I think I just have to rebase repo B onto repo A somehow, but I have no idea how to do that; maybe using fetch/pull and rebase? Thanks in advance for your help!
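
    A sketch of one way to do it: add the recovered repository as a remote of the newer one, fetch it, and rebase the new work onto the recovered history. Paths and branch names below are hypothetical, and the first commit of repo B (the re-imported snapshot) may end up as a redundant commit on top, which an interactive rebase could drop afterwards:

        # Hypothetical paths; run inside the newer repository (B).
        cd /path/to/repoB
        git remote add recovered /path/to/repoA
        git fetch recovered
        git rebase recovered/master          # replays B's commits on top of A's history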

    Read the article

  • Data Warehouse: One Database or many?

    - by drrollins
    At my new company, they keep all data associated with the data warehouse, including import, staging, audit, dimension and fact tables, together in the same physical database. I've been a database developer for a number of years now, and this consolidation of function and form seems counter to everything I know. It seems to make security, backup/restore and performance management more manually intensive. Is this something that is done in the industry? Are there substantial reasons for doing or not doing it? The platform is Netezza. The size is in terabytes, hundreds of millions of rows. What I'm looking to get from answers to this question is a solid understanding of how right or wrong this path is. From your experience, which issues should I focus my arguments on if this is a path that will cause trouble for us down the road? If it is no big deal, then I'd like to know that as well.

    Read the article

  • How to change the default Help browser for VS2010?

    - by Scott Bilas
    Visual Studio 2010 changed the help system to run a little daemon and launch the system default web browser to view it. I'm using Firefox for my system browser but would like to use Chrome for VS help. Is there an option to change the Help browser that I'm not seeing in Tools|Options? If not, is there a workaround or registry setting to do this? As a backup I've been using H3Viewer but I'd like to be able to get context-sensitive F1 help from within the VS IDE.

    Read the article

  • Service Broker error message: Dialog security is unavailable for this conversation because there is

    - by yanigisawa
    I am getting this error in my sys.transmission_queue table whenever I attempt to send a SQL Service Broker message between two different SQL Server servers (i.e. the databases are on two different physical machines): "Dialog security is unavailable for this conversation because there is no security certificate bound to the database principal (Id: 5). Either create a certificate for the principal, or specify ENCRYPTION = OFF when beginning the conversation." When this error refers to a "database principal", what is it referring to (the "master" database? the dbo user?)? I've used the CREATE CERTIFICATE command, backed up the certificate, and created a certificate with the same name on the other server from the first server's backup .cer file, but I keep getting this message. Any help would be appreciated in getting me pointed in the right direction; I must be missing something obvious. FYI, in my development environment, both the initiating and target databases were on the same physical server and the same SQL instance, and everything was working fine.

    Read the article

  • Known problems with filemtime() on Windows - files getting touched arbitrarily?

    - by Pekka
    Is there a known issue leading to file modification times of cache files on Windows XP SP3 getting arbitrarily updated, but without any actual change? Is there some service on a standard Windows XP installation - backup, sync, versioning, a virus scanner - known to touch files? They all have a .txt extension. If there isn't, forget it; then I'm getting something wrong in my cache routines, and I'll debug my way through. Background: I'm building a simple caching wrapper around a slow web site on a Windows server. I am comparing the filemtime() time stamp to some columns in the database to determine whether a cached file is stale. I'm having problems using this method because the modification time of the cache files seems to get updated in between operations without me doing anything. This results in stale files being displayed. I'm the only user on the machine. The operating system is Windows XP, the web server an XAMPP Apache 2 with PHP 5.2.

    Read the article

  • Escaping Problem in bash using isql

    - by latz
    Hi there, I am currently working on a little backup script for some Firebird databases and I've come up with a weird escaping problem that I don't seem to be able to solve. In my script I create a variable called sqllog into which I would like to put the output of a chain of commands; here it is:

        sqllog=`echo "SELECT * FROM RDB\$DATABASE;" | isql -u SYSDBA -pass mypasswd localhost:mydatabase | tail -n 2 | head -n 1 | wc -l`

    If I try to execute this in the shell I get the following error:

        Statement failed, SQLCODE = -204
        Dynamic SQL Error
        -SQL error code = -204
        -Table unknown
        -RDB
        -At line 1, column 15.

    "Table unknown RDB" means it didn't honour my attempt to escape the $. Thanks for any help :)
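
    For what it's worth, a sketch of one workaround: inside backticks the backslash is consumed before the inner command runs, so $DATABASE is expanded (to nothing) and only "RDB" reaches isql. Using $( ... ) for the command substitution and single quotes around the statement keeps the $ out of the shell's hands entirely; the rest of the pipeline is unchanged:

        # Single quotes stop the shell from touching $DATABASE at all.
        sqllog=$(echo 'SELECT * FROM RDB$DATABASE;' \
          | isql -u SYSDBA -pass mypasswd localhost:mydatabase \
          | tail -n 2 | head -n 1 | wc -l)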

    Read the article

  • Microsoft MyPhone - .NET access possible?

    - by ZombieSheep
    I've searched this as much as I can in the time available, but haven't turned anything up. Does anyone know if it is possible to access data backed up from a WinMo device to Microsoft's MyPhone service programmatically, without having to restore all the data back to the device? I'm looking at a way of also keeping a local backup of contacts and SMS messages on my desktop machine, but it seems that if there is an API for doing so, Microsoft hasn't advertised it at all. I'm hoping that my inability to find any reference on Google is due to my incompetence rather than it not being supported by Microsoft.

    Read the article

  • Firebug is permanently inactive

    - by bwb
    Firebug used to work fine, until something happened; I'm not sure what. Firefox 3.6.3 on XP Pro, current. The net effect is that Firebug is now always inactive. The icon is present and gray, as is normal. Clicking the icon gives the same result regardless of the site. I've uninstalled Firebug and re-installed it: nada. Additionally, the Tools > Firebug menu yields nothing. The only unusual thing I can think of is that it may have stopped working sometime after I created an additional Firefox profile (which has since been removed) for a test. Somehow, in removing the extra profile, my bookmarks were deleted; I recovered them from my online backup service. Any suggestions? I'd really like to get the Firebug functionality back. Thanks.

    Read the article

  • Making a bash script to check connectivity and change the connection if necessary. Help me improve it?

    - by cypherpunks
    My connection is flaky, but I have a backup one. I made some bash scripts to check for connectivity and change connections if the present one is dead. Please help me improve them. The scripts almost work, except for not waiting long enough to receive an IP (they cycle to the next step in the until loop too quickly). Here goes:

        #!/bin/bash
        # Invoke this script with paths to your connection specific scripts, for example
        # ./gotnet.sh ./connection.sh ./connection2.sh

        until [ -z "$1" ]  # Try different connections until we are online...
        do
            if eval "ping -c 1 google.com"
            then
                echo "we are online!" && break
            else
                $1  # Runs (next) connection-script.
                echo
            fi
            shift
        done

        echo  # Extra line feed.
        exit 0

    And here is an example of the slave scripts:

        #!/bin/bash
        ifconfig wlan0 down
        ifconfig wlan0 up
        iwconfig wlan0 key 1234567890
        iwconfig wlan0 essid example
        sleep 1
        dhclient -1 -nw wlan0
        sleep 3
        exit 0
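
    On the "not waiting long enough" part, a sketch of one approach: instead of a single ping right after a connection script runs, poll for a while before declaring the connection dead. This helper assumes the Linux ping (where -W is the per-packet timeout in seconds) and gives dhclient up to roughly 30 seconds to obtain a lease:

        wait_for_net() {
            # Try for about 30 seconds: 15 attempts, 2-second timeout plus 2-second pause each.
            for i in $(seq 1 15); do
                ping -c 1 -W 2 google.com >/dev/null 2>&1 && return 0
                sleep 2
            done
            return 1
        }

    In the until loop above, "if wait_for_net" would then replace the bare "if eval ping -c 1 google.com" test.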

    Read the article

  • Need to copy a remotely hosted file via a shell command

    - by pnm123
    There is a file hosted remotely on a server that does not support shell access. I bought a new server that does support shell access, so now I want to copy a file from the non-shell server to the new server via a shell command, using PuTTY. The file URL is like this: http://www.domain.com/file.gzip, and it is username/password protected. To be more specific, I want to copy a backup of a home directory from cPanel to my new server via a shell command. I did this a few months ago, but I don't remember how, and I have also failed to Google it.
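
    If the old server only exposes the file over HTTP with a username and password, the download can be done from the new server's shell with wget or curl; the URL is the one from the question and the credentials are placeholders:

        # USERNAME/PASSWORD are placeholders for the real credentials.
        wget --user=USERNAME --password=PASSWORD http://www.domain.com/file.gzip
        # or, equivalently, with curl:
        curl -u USERNAME:PASSWORD -o file.gzip http://www.domain.com/file.gzip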

    Read the article

  • Apply Patch Update

    - by Velu
    Hi, we are a product development company. We have an MSI setup for the product; it installs more than 500 assembly files on the customer's machine. Once we have released the product, there may be a problem in any of our assemblies, for several reasons. At that point we fix the problem and provide a patch update to the customer.

    Current system:
    1) Fix the problem in the development environment and build the assemblies.
    2) Generate the patch setup using an Inno Setup script with the modified assemblies.
    3) While installing the patch setup on the customer machine, it backs up the old assemblies and replaces them with the modified ones.

    Drawback: the customer can't uninstall the installed patch, because it just replaces the assemblies. Is it possible to have a patch system in MSI, like MSP files, or is there any option to uninstall the patch in my current system itself? Thanks, Velu

    Read the article

  • Click-Once deployment is leaving multiple versions (yes, more than 2)

    - by Clyde
    I've got a ClickOnce application that is leaving all old versions on my disk. It's an internal corporate application that gets frequent updates, so this is a disaster for our rapidly inflating backup size. According to the docs and other SO questions, it is supposed to leave only the current and previous versions on disk. However, each time I deploy the project and upgrade a client, I get another copy of all exe/dll/data files. I'm making no changes whatsoever to the application, just pushing Deploy again in Visual Studio. Any ideas?

    Updates: The problem seems to happen on both Windows 7 and XP, on 64-bit and 32-bit Windows. I've done a diff of the folders where the versions are installed and the following files are different: MyApp.exe.manifest, MyApp.exe.cdf-ms, MyDll1.cdf-ms, MyDll2.cdf-ms. No actual executable files are different, nor are MyApp.manifest, MyDll1.manifest, etc.

    Read the article

  • Removing a Subversion folder from the client and server

    - by Code Sherpa
    Hi. I have been using Subversion for a few days now and have a question... I have a folder deep in my Subversion trunk that I want to remove and replace with another folder. I have read about this on here and tried:

    Export: I clicked on the folder I wanted to remove, created a new backup folder elsewhere when prompted, and then exported.
    Delete: I next chose the delete option (in TortoiseSVN) on the folder I wanted to remove and clicked it.

    The folder I want to remove now has an "X" over it, as do all of its subfolders and files. But when I go to the Subversion repository on the remote server, I still see the folder I want to remove and all of its files. What do I have to do to get the clients to forget about this folder and the Subversion server to remove it permanently from its sub-folders? Thanks in advance...
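
    A delete in TortoiseSVN only schedules the removal in the working copy; nothing changes on the server until that change is committed. A sketch with the command-line client, deleting straight against the repository URL (hypothetical here) and then updating the checkout; note the folder stays in the repository's history unless the repository itself is dumped and filtered:

        # The repository URL below is a hypothetical placeholder.
        svn delete -m "Remove obsolete folder" http://svn.example.com/repo/trunk/path/to/folder
        svn update    # run inside each working copy so the clients drop the folder too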

    Read the article

  • .NET platform-independent sync framework

    - by Quandary
    Question: I need to synchronize a few ActionScript files from my computer to a network share (backup). I figured a quick fix would be to use the Microsoft Sync Framework for this and write a Windows service. My problem is that I also use Linux, and before I commit to MS vendor lock-in: is there any sync framework/library/whatever I could use that works across platforms? Or does the MS Sync Framework work on Linux too? It is my understanding that it is a wrapper around some COM objects, so it wouldn't. All I need is to synchronize files, so never mind the database part, although it would be nice to have it too.
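
    If plain file synchronization is really all that's needed, a scheduled rsync run covers both platforms without any framework (cwRsync, or rsync under Cygwin, on the Windows side); the paths here are hypothetical:

        # Mirror the ActionScript sources to the network share, removing files
        # that no longer exist locally (hypothetical source and target paths).
        rsync -av --delete ~/projects/actionscript/ /mnt/share/backup/actionscript/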

    Read the article

  • SQL Server 2000, how to automate import data from excel

    - by Stan
    Say the source data comes in Excel format; below is how I import it:

    1. Convert to CSV format via MS Excel.
    2. Roughly find bad rows/columns by inspecting the CSV.
    3. Back up the table that needs to be updated, in SQL Query Analyzer.
    4. Truncate the table (may need to drop foreign key constraints as well).
    5. Import the data from the revised CSV file in SQL Server Enterprise Manager.
    6. If there's an error such as duplicate columns, check the original CSV and remove them.

    I was wondering how to make this procedure more efficient at every step. I have some ideas, but they are incomplete. For steps 2 and 6: scripts that can check automatically and print out all offending row/column data, so it's easier to remove all the errors at once. For steps 3 and 5: is there any way to update the table automatically without manually going through the import steps? Could the community advise, please? Thanks.
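
    The load itself (step 5) is the easiest part to script: SQL Server 2000's bcp utility can pull the revised CSV straight into the truncated table from the command line or a scheduled job, which removes the Enterprise Manager wizard from the loop. A sketch, with server, database, table and file names as hypothetical placeholders:

        # Character-mode import, comma field terminator, newline row terminator.
        # MyDatabase, MyTable, revised_data.csv, MYSERVER and the login are placeholders.
        bcp MyDatabase.dbo.MyTable in revised_data.csv -c -t, -r\n -S MYSERVER -U sa -P secret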

    Read the article
