Search Results

Search found 26283 results on 1052 pages for 'temporary table'.

Page 636/1052

  • New WebKit tests

    I have updated the WebKit comparison table with data from Safari 5, Chrome 5, and Android 2.1. Improvements throughout! The top five WebKit browsers according to these tests are now: Chrome 5, Safari 5, Safari 4, Samsung WebKit (on bada), Android 2.1. Interesting findings: Chrome and Android now support localStorage (Safari already did). Chrome and Android now support geolocation. Safari does in theory, but it doesn’t give the actual coordinates, making the whole exercise a bit pointless. Chrome and Android...

    Read the article

  • How to load kernel from live cd on UEFI install of Ubuntu 12.10?

    - by Geezanansa
    Running a GIGABYTE FM1 motherboard with an AMD 3870K APU and a new WD Caviar 1TB HDD. Following the advice in the motherboard manual and https://help.ubuntu.com/community/UEFI, I have now got to the GRUB screen for a UEFI install. The dvd.iso being used is Ubuntu 12.10 desktop amd64. The HDD has had a GPT partition table created for it using GParted in a live desktop session (booted in BIOS mode), but I decided to leave it unformatted with the intention of using the installer to set up partitions. Booting the live DVD gives a GRUB list with the option to "install ubuntu", but I get "can not read cd/0" and "the kernel must be loaded first" errors when that option is selected. Any pointers on how to get the installer going for a UEFI install would be good. Thanks in advance.

    Read the article

  • Is it viable to become a contract programmer straight out of college?

    - by M G
    I have a Bachelor of Science in Computer Science and four months research experience designing and implementing a research project. I realize this is highly dependent on my skill set - which includes C, C++, Java, Python, and SQL. I feel I have an advantage in two ways: I am young and am not afraid to work overtime. I am willing to take lower pay to gather a client base/experience, and work nights/weekends to get a few projects under my belt. This may be cliche, but I feel that I can learn new technologies quicker than most. At the very least, I am not a slow study. With this being said, is it viable for me to become a contract programmer? Or do I need the 10+ year skill set that most contractors bring to the table?

    Read the article

  • Can Windows Media Player create playlists based on folder structure?

    - by Chaulky
    Over the years I've carefully molded my digital media collection into a series of folders that make it easy for me to find what I'm looking for. I recently discovered the awesomeness that is streaming video from Windows 7 Media Player to the PS3 so I can watch it on the big screen without all the hassle of hooking the computer up to the TV. The problem is, I totally lose my carefully crafted folder structure and all my videos become one giant mess again... not cool! As a temporary solution, I've created a few playlists for my favorites (Dexter Season 4, Dexter Season 5, Breaking Bad Season 1, etc.). This is a HUGE pain in the a$$. So, is there a way to get Windows Media Player (on Windows 7) to maintain some sort of folder structure based on the location of the actual video files? So if I have my videos sorted into folders by show and season, Media Player will pick that up and let me browse it in the same way. As an alternative answer, I'll accept suggestions for a program that can also stream to PS3 and has this "folder organization" feature.

    Read the article

  • Win7 loses connection to network shares after resume unless server specified using FQDN

    - by Szonja Zemkó
    My Win7 client has a connection to a Linux server and its shared folders. The problem occurs when the computer wakes up after a sleep: one of the shared folders is no longer accessible, and I receive the following message: Error code: 80070035, The network path was not found. I have the problem with one specific folder only. When I restart the computer, the problematic folder is accessible again. When I log off before sleep, the folder is accessible after wakeup. If I try to access the folder by using the FQDN of the server or the server IP, it is also accessible. As a temporary solution I mapped the folder to a network drive using the FQDN and it's working fine, but it's inconvenient since every other folder is accessible on the server. To summarize:
    - \\server\problematicshare no longer works after resume (the Samba server sees my client connect and then disconnect a few seconds later, while I receive the above error message)
    - \\server\othershare works after resume
    - \\fqdn.of.server\problematicshare always works
    - \\ip.of.server\problematicshare always works
    - once the problem manifests, I'm no longer able to restart the "Workstation" service (it is not responding)
    - restarting the "Computer Browser" service has no apparent effect
    - the event log doesn't contain anything that seems relevant
    - "ping server" works

    Read the article

  • Multiple ( V- / I- ) Buffers, is it sane?

    - by Techie
    Currently I am developing an RTS game using XNA (/ ANX.Framework). There is one thing that bothers me: I am not sure in what way to organise buffers. Should I use a new VertexBuffer for every object (e.g. a character, a table, a model), or is it better to use ONE HUGE/BIG buffer to store all geometry in? I am still new to 3D programming, though I have already finished a couple of games using DirectX 9. Well, I hope this question isn't a duplicate, and I appreciate any answer leading me in the right direction.

    Read the article

  • C# and SQL data layer code generator

    I've created a simple yet efficient tool to help generate stored procedures and a C# data access layer from a table. Instead of using an ORM, this uses standard ADO.NET (SqlConnection, SqlDataReader, etc). Check it out at www.asteio.com. It's saved me a ton of time and I'm hoping it does the same for you.

    Read the article

  • SQL Server Data Type Precedence

    I am executing a simple query/stored procedure from my application against a large table and it's taking a long time to execute. The column I'm using in my WHERE clause is indexed and it's very selective. The search column is not wrapped in a function, so that's not the issue. What could be going wrong?
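    A classic culprit here is data type precedence: when a parameter's type outranks the indexed column's type (for example nvarchar versus varchar), SQL Server implicitly converts the column side, which can turn an index seek into a scan. A minimal sketch of the effect, with a hypothetical Customers table and column names invented for illustration:

        -- Hypothetical schema: dbo.Customers.LastName is varchar(50) and indexed.
        -- An nvarchar parameter outranks varchar, so the column is converted and
        -- the predicate stops being sargable (scan instead of a seek).
        DECLARE @name nvarchar(50) = N'Smith';
        SELECT CustomerID FROM dbo.Customers WHERE LastName = @name;

        -- Declaring the parameter with the column's own type avoids the conversion.
        DECLARE @name2 varchar(50) = 'Smith';
        SELECT CustomerID FROM dbo.Customers WHERE LastName = @name2;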

    Read the article

  • Project Turing: Beginning RIA Services

        Turing Project Page: [Novice: 9 | Advanced: 6] FAQ | Table of Contents | Definitions What is this and where do I start? Reposted with VB.Net code. From Database to DataGrid: The next step in Project Turing is to create a first iteration of the Silverlight application that will retrieve data from our database. Using our technology of [...]

    Read the article

  • Question regarding drives

    - by user205934
    I am a new Ubuntu user who has spent a lot of time on Windows. A very common practice for me on Windows was making two drives, C: and D:, storing installs/files on C:, and using D: for backup or for anything I downloaded and wanted to keep. When installing Ubuntu, it asked me if I wanted to replace Windows 7. I thought it would install Ubuntu on C:, but instead it used the whole disk; nevertheless I recovered my backup using testdisk. What I wanted to do was create a similar backup drive on Linux too. My current partition table:
        sda      8:0    0 232.9G  0 disk
        +-sda1   8:1    0 230.9G  0 part /
        +-sda2   8:2    0     1K  0 part
        +-sda5   8:5    0     2G  0 part [SWAP]
        sr0     11:0    1  1024M  0 rom
    So should I use GParted to create another sda3 and store my important data on that? Also, my current sda2 is listed as an extended partition; should I delete it? It's a very small partition, just 1K.

    Read the article

  • Profile not loaded correctly (Cannot access registry)

    - by xaav
    Every so often, I log on and get the following message: "User profile was not loaded correctly. You have been logged on with a temporary profile. Changes you make to this profile will be lost when you log off. Please see the event log for details or contact your administrator." This almost always happens when somebody else has been on the computer for a while and then I log on. This never used to happen, but now it happens pretty often. My profile is not permanently corrupted; all I have to do is restart my computer, but this annoys me and I would like to fix it. I was curious about the cause, so I looked into the Event Log and found that the root of the problem is that the ntuser.dat file in the profile I am logging on to is locked at logon time. This results in the current user's registry not being loaded, and so the profile fails to load. I just found a Microsoft article that mentions this exact issue: http://support.microsoft.com/kb/960464/ The problem is that I do not want to delete this profile, and this issue does not come up every time I log on, only when somebody else has been on a long time before me. What could be locking this file? Is there any way to get a list of processes without logging on, so that I can identify which process has the file locked? Any other suggestions?

    Read the article

  • How can I pull data from PeopleSoft on demand?

    - by trpt4him
    I work in IT at a university and I'm working with about 5 different departments to develop a new process for students to apply to a specific school within the university (not the university as a whole). We're using a web-based college application vendor and adding the applicant questions for the school itself to the main university application. Currently the main application feeds into PeopleSoft. The IT staff here is building a new table to hold just our school's applicant data. I want to be able to access that data from PeopleSoft for use in external applications, but our IT staff doesn't really seem to understand what I'm requesting, as they simply tell me I can have access to the PS query tools. The problem is, I don't want to run just ad hoc queries, I want to be able to connect from outside PeopleSoft and show current data within the external app. I am unable to find documentation or get a clear answer to my question. Does PeopleSoft support access via a web services API or anything similar, and does that sound like the right direction for me to take?

    Read the article

  • Is a blob more efficient than a varchar for data that can be ANY size?

    - by BillyNair
    When setting up a database I want to use the most efficient data type for potentially fairly long data. Currently my project is to store song titles and thoughts pertaining to each song. Some titles might be 5 characters, others longer than 100 characters, and the thoughts could run pretty long. Is it more efficient to use a varchar set to 8000 or to use a blob? Is using a blob the same as a varchar, in that there is a set size it is allocated regardless of what it holds? Or is it just a pointer, so it doesn't really use much space in the table? Is there a certain set size of a blob in KB, or is it expandable?
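    For what it's worth, in most engines both VARCHAR and TEXT/BLOB are variable-length: a row consumes only what the value needs plus a small length prefix, and very long values may be moved off-page depending on the storage engine. A hedged sketch of the two options, assuming a MySQL-style schema with made-up table and column names:

        -- Both columns below are variable-length; neither pre-allocates its maximum size per row.
        CREATE TABLE song_thoughts (
            id       INT AUTO_INCREMENT PRIMARY KEY,
            title    VARCHAR(255) NOT NULL,  -- short value, cheap to index and sort
            thoughts TEXT                    -- long free-form text; may be stored off-page when large
        );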

    Read the article

  • Organizing your Data Access Layer

    - by nighthawk457
    I am using Entity Framework as my ORM in an ASP.NET application. I have my database already created, so I ended up generating the entity model from it. What is a good way to organize files/classes in the data access layer? My Entity Framework model is in a class library, and I was planning on adding additional classes per entity (i.e. per database table) and putting all the queries related to those tables in their respective classes. I am not sure if this is the right approach, and if it is, where do the queries requiring data from multiple tables go? Or am I completely wrong in organizing my files based on entities/tables, and should I organize them based on functional areas instead?

    Read the article

  • ASP.NET Pivot Grid Control Supports New Layout - v2010 vol 1

    The ASPxPivotGrid will now support a slick new feature that will help save you screen space: Compact Layout for Hierarchical Row Values. With DXperience v2010 vol 1, you can create compact layouts for hierarchical pivot table row values (this capability is also available in the WinForms and WPF versions of this control). The compact layout allows you to get more space horizontally without sacrificing the distinction between the hierarchical row values. DXperience? What's That? DXperience is the...

    Read the article

  • Join multiple consecutive SQLite database dump files into 1 common database? Purpose: Search through ENTIRE Chrome Browsing History

    - by porg
    Google Chrome's default web browsing history search engine only lets you access the records of the most recent 100 days. Nevertheless, in your application data, Chrome keeps your entire browsing history in SQLite database files, with the file naming scheme of "History Index YYYY-MM". I am looking for a way to search through my entire browsing history, with sophisticated filters (limit search terms to certain fields such as URL, domain, title, body text; wildcard or regex terms; date ranges), in either:
    - some ready-made software. eHistory came close, as it can limit terms to fields, but it lacks wildcards/regexes and has the same limited time horizon as the default search. Beyond that, I could not find any suitable Chrome extension or standalone (Mac) app;
    - or a command line to join multiple SQLite database files into one database, which I can then query (with the full syntax power), in the spirit of the pseudo code below. Preferred this way:
        sqlite --targetDatabase ChromeHistoryAll --importFiles /path/to/ChromeAppData/History\ Index* --importOnlyYetUnknownFiles
      Or, if my desired feature --importOnlyYetUnknownFiles is not possible (the feature could also be called "avoid duplicate imports by checking UIDs"), then by explicitly importing only those files which I know have not yet been imported into the ChromeHistoryAll database:
        cd ChromeAppData; sqlite --databaseTarget ChromeHistoryAll --importFiles YetNotImported1 YetNotImported2 YetNotImported3
      All my queries I would then perform in the database "ChromeHistoryAll".
    P.S.: Additional question of general interest: Is there a way to perform a database query in a temporary database which was created on the fly from multiple files? Like:
        sqlite --query="SQL query" --targetDatabase DbAll --DBtemporaryInRAM --importFiles db1 db2 db3
    This is surely not applicable for my Chrome question, as these History Index files have a combined size of 500MB, so such a query would perform badly. But it could come in handy in other situations.
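    There is no built-in sqlite switch matching that pseudo code, but the sqlite3 shell can do the merge itself with ATTACH. A rough sketch of the approach, with the caveat that the real Chrome "History Index" schema is not reproduced here; a hypothetical pages table with a unique id stands in for it:

        -- Run inside: sqlite3 ChromeHistoryAll.db
        ATTACH DATABASE 'History Index 2010-05' AS part;

        -- Create the combined table once, with a primary key so that
        -- INSERT OR IGNORE skips rows that were already imported.
        CREATE TABLE IF NOT EXISTS pages (
            id INTEGER PRIMARY KEY,
            url TEXT, title TEXT, body TEXT, visit_time INTEGER
        );

        INSERT OR IGNORE INTO pages
            SELECT id, url, title, body, visit_time FROM part.pages;

        DETACH DATABASE part;
        -- Repeat ATTACH / INSERT OR IGNORE / DETACH for each remaining
        -- "History Index YYYY-MM" file, then query ChromeHistoryAll.db directly.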

    Read the article

  • DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder?

    - by Rob
    Is it more reliable, faster, longer lasting to burn to CD/DVD a zip (or a few large zips) of files rather than the files as a folder? Just thinking that 1000s of small files would not be as efficiently recorded compared with one or a few large zips. Also, even after the burning program verifies the disc, I use Beyond Compare to compare the files with those on the disc. They always binary-compare as identical, but I hear the drive stuttering, presumably as the head is being shifted only slightly each time to seek the next file, which leads me to think that it's best to make one or more zips and copy those locally to compare. Or is it that burning individual files to the disc makes them less readable, which causes the head to stutter? There aren't any problems, my disc burns are reliable, just thinking more of efficiency and longevity; the discs burn and verify fast enough on my 18x DVD burner. I'm using ImgBurn mostly. Also used Nero in the past. I burn whole discs, closed, finalised. Not sure which write mode, but I would think Disc At Once from a temporary cached image made by the burning program would be the most reliable.

    Read the article

  • Retrieve data based on date range using MySQL, PHP [on hold]

    - by preethi
    I am working on WPF where I have two datepickers. When I try to retrieve the information for a date range, it displays only one record for all dates (the same record is displayed multiple times, e.g. for the date range 01/10/2013 - 03/10/2013): I have 3 different records, one on each day, but my output is the first record displayed 3 times with the same date and time.
        function cpWhitelistStats() {
            $startDate     = $_POST['startDate'];
            $startDateTime = "$startDate 00:00:00";
            $endDate       = $_POST['endDate'];
            $endDateTime   = "$endDate 23:59:59";
            $cpId          = $_POST['id'];
            $cpName        = etCommonCpNameById($cpId);
            print "<h2 style=\"text-align: center;\">Permitted Vehicle Summary</h2>";
            print "<h2 style=\"text-align: center;\">for $cpName</h2>";
            $tmpDate    = explode("/", $startDate);
            $startYear  = $tmpDate[2];
            $startMonth = $tmpDate[1];
            $startDay   = $tmpDate[0];
            $tmpDate    = explode("/", $endDate);
            $endYear    = $tmpDate[2];
            $endMonth   = $tmpDate[1];
            $endDay     = $tmpDate[0];
            $startDateTime  = "$startYear-$startMonth-$startDay 00:00:00";
            $endDateTime    = "$endYear-$endMonth-$endDay 23:59:59";
            $custId         = $_SESSION['customerID'];
            $realCustomerId = $_SESSION['realCustomerId'];
            $maxVal         = 0;
            if ($custId != "") {
                $conn = &newEtConn($custId);
                // Get the whitelist plates
                $staticWhitelistArray = etCommonMkWhitelist($conn, $cpId);
                array_shift($staticWhitelistArray);
                $startLoopDate = strtotime($startDateTime);
                $endLoopDate   = strtotime($endDateTime);
                $oneDay        = 60 * 60 * 24;
                // Get the entries
                $plateList    = array_keys($staticWhitelistArray);
                $plate_lookup = implode('","', $plateList);
                $sql = "SELECT plate, entry_datetime, exit_datetime FROM stats WHERE plate IN (\"$plate_lookup\") AND entry_datetime > \"$startDateTime\" AND entry_datetime < \"$endDateTime\" AND carpark_id=\"$cpId\" ";
                $result = $conn->Execute($sql);
                if (!$result) {
                    print $conn->ErrorMsg();
                    exit;
                }
                $rows = $result->fields;
                if ($rows != "") {
                    unset($myArray);
                    foreach ($result as $values) {
                        $plate           = $values['plate'];
                        $new_platelist[] = $plate;
                        $inDateTime      = $values['entry_datetime'];
                        $outDateTime     = $values['exit_datetime'];
                        $tmp             = explode(' ', $inDateTime);
                        $inDate          = $tmp[0];
                        $in_ts           = strtotime($inDateTime);
                        $out_ts          = strtotime($outDateTime);
                        $duration        = $out_ts - $in_ts;
                        $dur_array       = intToDateArray($duration);
                        $dur_string      = '';
                        if ($dur_array['days']  > 0) { $dur_string .= $dur_array['days']  . ' days '; }
                        if ($dur_array['hours'] > 0) { $dur_string .= $dur_array['hours'] . ' hours '; }
                        if ($dur_array['mins']  > 0) { $dur_string .= $dur_array['mins']  . ' minutes '; }
                        if ($dur_array['secs']  > 0) { $dur_string .= $dur_array['secs']  . ' secs '; }
                        $myArray[$plate][] = array($inDateTime, $outDateTime, $inDate, $dur_string);
                    }
                }
                while ($startLoopDate < $endLoopDate) {
                    $dayString = strftime("%a, %d %B %Y", $startLoopDate);
                    $dayCheck  = strftime("%Y-%m-%d", $startLoopDate);
                    print "<h2>$dayString</h2>";
                    print "<table width=\"100%\">";
                    print " <tr>";
                    print " <th>VRM</th>";
                    print " <th>Permit Group</th>";
                    print " <th>Entry Time</th>";
                    print " <th>Exit Time</th>";
                    print " <th>Duration</th>";
                    print " </tr>";
                    foreach ($new_platelist as $wlPlate) {
                        if ($myArray[$wlPlate][0][2] == $dayCheck) {
                            print "<tr>";
                            print "<td>$wlPlate</td>";
                            if (isset($myArray[$wlPlate])) {
                                print "<td>" . $staticWhitelistArray[$wlPlate]['groupname'] . "</td>";
                                print "<td>" . $myArray[$wlPlate][0][0] . "</td>";
                                print "<td>" . $myArray[$wlPlate][0][1] . "</td>";
                                print "<td>" . $myArray[$wlPlate][0][3] . "</td>";
                            } else {
                                print "<td>Vehicle Not Seen</td>";
                                print "<td>Vehicle Not Seen</td>";
                                print "<td>Vehicle Not Seen</td>";
                            }
                            print "</tr>";
                        }
                    }
                    print "</table>";
                    $startLoopDate = $startLoopDate + $oneDay;
                }
            }
        }

    Read the article

  • 3D Location Handling

    - by tgrosinger
    I am thinking about making a simulator type game that will involve having lots of small objects in a 3D space. What is the typical solution for handling these objects? The first thing that comes to mind is a 3D Array, but I can't help but think there is a more efficient solution. Another idea that comes to mind is objects having possession of smaller items. For example a House possesses a Table which possesses a Cup and Bowl. The final way I can think of handling this is just having an array of "objects" that each have an x, y, z value. While this would make storing them easy I do not understand how you would detect collisions without just looking at every possible object and checking to see if it is in the way. Are there other ways of holding onto these objects that is more efficient?

    Read the article

  • Auth succeeded No requires line available

    - by user286223
    After upgrading my server to Ubuntu 14.04 (and Apache 2.4), I can't use MySQL for htaccess-style authentication any more. Before the upgrade I was using MySQL authentication in the virtual host, and it worked nicely:
        <Directory /var/www/html/tilmelding/login>
            AuthBasicAuthoritative Off
            AuthUserFile /dev/null
            # begin auth_mysql configuration
            AuthMySQL On
            AuthMySQL_Host localhost
            AuthMySQL_User **********
            AuthMySQL_Password *********
            AuthMySQL_DB **********
            AuthMySQL_Password_Table user_info
            AuthMySQL_Username_Field user_name
            AuthMySQL_Password_Field user_passwd
            AuthMySQL_Empty_Passwords Off
            AuthMySQL_Encryption_Types PHP_MD5
            AuthMySQL_Authoritative
            AuthType Basic
            AuthName "auth_mysql test"
            Require valid-user
        </Directory>
    After the upgrade it didn't work. In the logfile I got:
        [:debug] [pid 31333] mod_auth_mysql.c(1578): Constructing password collection query with passfield=[user_passwd], table=[user_info], userfield=[helgoland], where_clause=[]
        [:debug] [pid 31333] mod_auth_mysql.c(1410): sec->dbh in /var/www/html/tilmelding/login/ is
        [:debug] [pid 31333] mod_auth_mysql.c(1417): Ordinary query
        [:debug] [pid 31333] mod_auth_mysql.c(1434): Running query: [SELECT user_passwd FROM user_info WHERE user_name='helgoland']
        [:debug] [pid 31333] mod_auth_mysql.c(1522): Checking with PHP_MD5
        [:debug] [pid 31333] mod_auth_mysql.c(1524): Auth succeeded
        [:error] [pid 31333] No requires line available
    I am able to run the query from the CLI and get an MD5 back. What have I missed?

    Read the article

  • Disk (EXT4) suddenly empty without any sign of why

    - by Ohnomydisk
    I have an Ubuntu 10.04 server with several disks in it. The disks are set up with a union filesystem, which presents them all as one logical /home. A few days ago, one of the disks appears to have suddenly 'become empty', for lack of a better explanation. The amount of data on the /home mount almost halved within minutes; the disk appears to have had just over 400 GB of data prior to 'becoming empty'. I have absolutely no idea what happened. I was not using the server at the time, but there are half a dozen other users who may have been (without root access and without the ability to hose a whole disk). I've run SMART tests on the disk and it comes back clean. The filesystem checks fine (it has 12 GB used now, as some user software continued downloading after the incident). All I know is that at around midnight on October 19, the disk usage changed dramatically. The data points are every 15 minutes, and the full loss occurred between captures:
        2012-10-18 23:58:03.399647 - has 953.97/2059.07 GB [46.33 percent]
        2012-10-19 00:13:15.909010 - has 515.18/2059.07 GB [25.02 percent]
    Other than that, I have not much to go off :-( I know that:
    - there's nothing interesting in the log files at that time
    - nobody appeared to be logged in via SSH at the time it occurred (most users do not even use SSH)
    - the server was online through whatever occurred (3 months uptime)
    - none of the other disks were affected and everything else on the server looks completely normal
    - I have tried using "extundelete" on the disk and it didn't really find anything (some temporary files, but they looked new anyway)
    I am completely at a loss as to what could have caused this. I was initially thinking maybe a root escalation exploit, but even if someone did maliciously "rm" the disk contents, wouldn't it take more than 15 minutes for 400 GB?

    Read the article

  • How do I fix a permissions problem with MS Distributed File System?

    - by charlesrandall
    I have a new computer, Windows 7, that is supposed to have access to particular network resources on a Distributed File System. However, despite all permissions being set correctly, I have consistent trouble accessing them. For instance, I'm supposed to be able to reach \\company.org\main\subdir. All the permissions have been granted, only when I try to access it by name, it tells me I don't have permission to access \main. This is where the fun starts. If I ping company.org, get the IP, and replace company.org with the IP, I can then access \\IP\main\subdir without any problems at all. However, we have a ton of scripts and build tools that access the network resource by name. My sysadmin has found that, using MS's dfsutil.exe, we can fix it temporarily with this sequence of commands:
        C:\dfsutil.exe /pktinfo
        C:\dfsutil.exe /PktFlush
        C:\dfsutil.exe /SpcFlush
        C:\dfsutil.exe /PurgeMupCache
        C:\dfsutil.exe /pktinfo
    After that, everything is great... until I reboot, or until some unspecified time later when suddenly I don't have access to \main\ anymore. Hoping to find a more permanent solution than waiting for it to break and running a batch file.

    Read the article

  • About Leader Board in Javascript (Array) Help Me Please

    - by raulcorrales
    Hi to all. I need help with a score table for my game.
    1. I have 4 variables:
        var Player1Score = 44;
        var Player2Score = 12;
        var Player3Score = 45;
        var Player4Score = 26;
    2. I make an array:
        var MyArray = [Player1Score, Player2Score, Player3Score, Player4Score];
    3. Sort the array:
        MyArray.sort();
    4. Print:
        ----------HIGHSCORES----------
        45
        44
        26
        12
    My question is: how can I print the names of the players in that order? Like this:
        ----------HIGHSCORES----------
        PLAYER 3    45
        PLAYER 1    44
        PLAYER 4    26
        PLAYER 2    12
    Thanks in advance. Greetings

    Read the article

  • Can I remove the original file while running "sort"?

    - by Spaceman
    I'm sorting a huge file, around 400 gigabytes. I'm running out of disk space and I must do something quickly. Let's assume the original file is called original_file. So I execute it (simplified) as "sort original_file | gzip -c > output_file". I use /home/tmp as a temporary dir. From what I see, there are a lot of intermediate files, like so: tmpA465, tmpB154, ... and so on. The smallest ones are about 12 megabytes; the largest are ~182 megabytes. So it seems that the "sort" command has already split the original file into small pieces and sorted them, and is now merging them into bigger parts (which will, eventually, be sorted as well). Please correct me if I'm wrong. Can I remove the original file right now without terminating the sort process? I've been waiting for a few days for this, and it's important that the "sort" command does not fail and that I finally get the result file. The OS is Ubuntu server 13.04, x64. Thanks!

    Read the article

  • Using INSERT / OUTPUT in a SQL Server Transaction

    Frequently I find myself in situations where I need to insert records into a table in a set-based operation wrapped inside a transaction where, secondarily and within the same transaction, I spawn off subsequent inserts into related tables to which I need to pass the key values that were the outcome of the initial INSERT command. Thanks to a Transact-SQL enhancement in SQL Server, this just became much easier and can be done in a single statement... WITHOUT A TRIGGER!
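    The enhancement in question is the OUTPUT clause named in the title. A minimal sketch of the pattern, with made-up table and column names and assuming Orders.OrderID is an IDENTITY column:

        BEGIN TRANSACTION;

        -- Table variable that captures the key values generated by the first insert.
        DECLARE @NewOrders TABLE (OrderID int, CustomerID int);

        INSERT INTO dbo.Orders (CustomerID, OrderDate)
        OUTPUT inserted.OrderID, inserted.CustomerID INTO @NewOrders
        SELECT CustomerID, GETDATE()
        FROM dbo.PendingCustomers;

        -- Secondary insert driven by the captured keys, inside the same transaction.
        INSERT INTO dbo.OrderAudit (OrderID, CustomerID, AuditDate)
        SELECT OrderID, CustomerID, GETDATE()
        FROM @NewOrders;

        COMMIT TRANSACTION;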

    Read the article
