Search Results

Search found 83713 results on 3349 pages for 'data change'.


  • openSUSE full disk encryption

    - by djechelon
    I'm a proud Suser. I'm about to reinstall 12.2 on my ASUS N76VZ (a UEFI x64 laptop). Since I'm very sensitive about laptop security against theft or unwanted inspection, I chose to use BitLocker with a USB dongle in Windows 7. When installing openSUSE the last time, I found that only the home partition (separate from root) could be encrypted. Does openSUSE offer a full-disk-encryption solution like BitLocker that I haven't discovered yet? Or is encrypting the home partition the only way to protect data? Encrypting only home is feasible, since one stores personal data there, but I would still like to encrypt the whole thing! Also, using a hardware token (no TPM available) for unlocking would be preferred to a password, if possible. Thanks
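
    For reference, the standard Linux building block for this is LUKS (dm-crypt), and a keyfile kept on a USB stick can stand in for the BitLocker dongle. A minimal sketch, assuming a hypothetical encrypted partition /dev/sda2 and the USB stick mounted at /mnt/usb:

      # Generate a random keyfile on the USB stick (hypothetical paths)
      sudo dd if=/dev/urandom of=/mnt/usb/luks.key bs=512 count=8
      sudo chmod 0400 /mnt/usb/luks.key

      # Enroll the keyfile in a spare LUKS key slot
      sudo cryptsetup luksAddKey /dev/sda2 /mnt/usb/luks.key

      # Unlock with the keyfile instead of a passphrase
      sudo cryptsetup luksOpen --key-file /mnt/usb/luks.key /dev/sda2 cryptdata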


  • How do I create a free server-side database to be accessed via Windows forms and/or browser?

    - by NoCatharsis
    I have no formal education in databases or programming, but I've learned enough SQL, C++, and C# to at least get started setting up a small database on my company's server. Using MS SQL Server 2008 R2, I have created the database and set up columns with proper data types. However, there seem to be a lot of tweaks and details that are way over my head. Since I would like this data to be accessible to the other 7 or 8 people in my office (preferably via web browser), I'm wondering whether this is the best setup for my situation. The other option I've read about is a LAMP server, which I assume is the competing free option to Microsoft's Express packages. I know nothing of LAMP servers except from the articles I've read on how to set them up (and I believe I even saw a detailed tutorial somewhere). To summarize, my question is this: which of these (or any other) server setups would best suit my purposes, keeping in mind that I'm a true novice (but willing to learn) and would like to keep it free until I get more experience?


  • How frequent are network partitions on cloud services?

    - by roja
    Much is made of the CAP trade-off for data storage, where conflicts can be introduced if there is a network partition. My question: is there any evidence that this is a problem that arises with any significant frequency in modern cloud IaaS services, e.g. EC2, Azure, Rackspace? Is it a problem which, despite being a theoretical roadblock to constructing idealised distributed systems, is in fact a non-issue for all practical concerns? Has anyone experienced a network partition within one of these systems (within a single data centre)? If so, would you be willing to share any details?


  • File copy error from system to CIFS mount

    - by dwpriest
    When copying a file greater than 64 kB from an Ubuntu server to a CIFS-mounted Windows share, most of the data is copied, but it seems the last chunk doesn't get copied: the sizes don't match, and the MD5 checksums don't match. I have plenty of free space, but when I use cp, I get the following:

      cp: closing `cloudBackup/asdf.txt': No space left on device

    Using rsync, I get the following:

      rsync: close failed on "/home/fluffy/cloudBackup/.asdf.txt.qrBWe6": No space left on device (28)
      rsync error: error in file IO (code 11) at receiver.c(752) [receiver=3.0.8]
      rsync: connection unexpectedly closed (29 bytes received so far) [sender]
      rsync error: error in rsync protocol data stream (code 12) at io.c(601) [sender=3.0.8]

    I have full read/write permissions on the mounted share. I can copy via SSH just fine. Any ideas? Thank you
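
    One experiment that sometimes helps (an assumption, not a confirmed diagnosis): "No space left on device" on close can come from the server rejecting the negotiated CIFS write size, so remounting with an explicit, smaller wsize is worth a try. The server name and credentials below are hypothetical:

      # Remount the share with an explicit write size
      sudo umount /home/fluffy/cloudBackup
      sudo mount -t cifs //winserver/backup /home/fluffy/cloudBackup \
          -o username=fluffy,wsize=65536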


  • How to use symbolic links in Windows Server 2008 R2 across the network (mklink)

    - by server info
    I have one server (Srv1) which holds data with file shares, and its storage is full. I have a second server (Srv2) which has a lot more space. Now I would like to transfer all the data from Srv1 to Srv2 and leave links behind pointing to the new destination. I found mklink very useful here, but unfortunately it does not work over the network, as the documentation also points out. People rely heavily on the existing paths, so it would be helpful if someone had a pointer for me on how to handle symbolic links across the network with Windows servers. I am running Windows Server 2008. Thanks for any help.
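
    For what it's worth, mklink can create a directory symlink whose target is a UNC path; what usually blocks the remote case is the symlink evaluation policy, which fsutil can relax. A hedged sketch (share and path names hypothetical), run from an elevated prompt:

      :: Allow remote-to-remote and remote-to-local symlink evaluation
      fsutil behavior set SymlinkEvaluation R2R:1 R2L:1

      :: On Srv1, leave a directory symlink pointing at the share on Srv2
      mklink /D D:\Shares\Data \\Srv2\Data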


  • 404 when doing safe-upgrade on a Lucid 64 box?

    - by Millisami
    Why do I see a 404 when doing sudo aptitude safe-upgrade on my Lucid 64 box?

      deploy@li167-251:~$ sudo aptitude safe-upgrade
      Reading package lists... Done
      Building dependency tree
      Reading state information... Done
      Reading extended state information
      Initializing package states... Done
      The following packages will be upgraded: apache2 apache2-mpm-prefork apache2-threaded-dev apache2-utils apache2.2-bin apache2.2-common apt apt-utils base-files binutils bzip2 dpkg dpkg-dev gzip ifupdown krb5-multidev language-pack-en language-pack-en-base language-selector-common libatk1.0-0 libatk1.0-dev libavahi-client3 libavahi-common-data libavahi-common3 libbz2-1.0 libc-bin libc-dev-bin libc6 libc6-dev libc6-i686 libcups2 libfreetype6 libfreetype6-dev libglib2.0-0 libglib2.0-dev libgssapi-krb5-2 libgssrpc4 libgtk2.0-0 libgtk2.0-common libgtk2.0-dev libk5crypto3 libkadm5clnt-mit7 libkadm5srv-mit7 libkdb5-4 libkrb5-3 libkrb5-dev libkrb5support0 libldap-2.4-2 libldap2-dev libmysqlclient-dev libmysqlclient16 libnotify-dev libnotify1 libpam-modules libpam-runtime libpam0g libparted0debian1 libpng12-0 libpng12-dev libpq-dev libpq5 libssl-dev libssl0.9.8 libtiff4 libudev0 libusb-0.1-4 linux-libc-dev mountall mysql-client mysql-client-5.1 mysql-client-core-5.1 mysql-common mysql-server mysql-server-5.1 mysql-server-core-5.1 openssh-client openssh-server openssl parted python-apt sudo tzdata udev upstart ureadahead wget xulrunner-1.9.2 xulrunner-1.9.2-dev
      The following packages are RECOMMENDED but will NOT be installed: colibri debhelper fakeroot hicolor-icon-theme libatk1.0-data libglib2.0-data libgtk2.0-bin libhtml-template-perl manpages-dev notification-daemon notify-osd ssl-cert xauth xfce4-notifyd
      88 packages upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
      Need to get 85.8MB of archives. After unpacking 1712kB will be used.
      Do you want to continue? [Y/n/?] y
      Writing extended state information... Done
      Get:1 http://security.ubuntu.com/ubuntu/ lucid-updates/main libpam-modules 1.1.1-2ubuntu5 [358kB]
      Get:2 http://security.ubuntu.com/ubuntu/ lucid-updates/main base-files 5.0.0ubuntu20.10.04.2 [70.2kB]
      Get:3 http://security.ubuntu.com/ubuntu/ lucid-updates/main gzip 1.3.12-9ubuntu1.1 [102kB]
      Err http://security.ubuntu.com/ubuntu/ lucid-updates/main libc-bin 2.11.1-0ubuntu7.2
        404 Not Found [IP: 91.189.88.37 80]
      Err http://security.ubuntu.com/ubuntu/ lucid-updates/main libc6 2.11.1-0ubuntu7.2
        404 Not Found [IP: 91.189.88.37 80]
      Err http://security.ubuntu.com/ubuntu/ lucid-updates/main libc6-i686 2.11.1-0ubuntu7.2
      .........
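
    The usual first suspect (an assumption, not a confirmed diagnosis): stale package indexes still pointing at package versions that have since been superseded on the mirror, which yields exactly these 404s. Refreshing the lists before upgrading often clears it:

      sudo aptitude update
      sudo aptitude safe-upgrade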


  • How do I determine the image compression algorithm?

    - by klijo
    I have a folder containing images, and I need to determine the image compression algorithm used in them. The image format is TIFF. Is there a program I can use to do this? A program that runs on Windows or Linux is OK. When I run file, it gives:

      100 (2).tif: TIFF image data, little-endian
      100.tif: TIFF image data, little-endian

    It doesn't say which algorithm is used, whether it's lossy or lossless, or the name of it.
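
    Two possibilities, assuming ImageMagick or libtiff's command-line tools are installed: both read the Compression tag straight out of the TIFF metadata.

      # ImageMagick: print each file's compression type (e.g. LZW, JPEG, None)
      identify -format "%f: %C\n" *.tif

      # libtiff: tiffinfo dumps all tags, including Compression
      tiffinfo 100.tif | grep -i compression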


  • Google Chrome doesn't stay logged in to Google sites when using pinned tabs

    - by Nick T
    Despite checking "stay logged in" or the like on Gmail or Docs, Chrome refuses to do so when I close and re-open it with Google sites pinned. If they're not pinned, it works fine. The "Clear cookies and other site and plug-in data when I close my browser" checkbox in the settings is not checked, and I don't have any cookie exceptions. All settings are defaults, and incognito mode is not being used. This occurs on all my computers using Chrome. I have deleted my cookies file (%userprofile%\AppData\Local\Google\Chrome\User Data\Default\Cookies) with no effect (other than losing the logins that ordinarily work fine). Of note: when I relaunch Chrome with Gmail pinned and it asks me to log in, doing so fails on the first attempt (nothing happens; no errors), then works on the second. If I refresh the window before logging in, it works on the first attempt.


  • Elastix: how to MOVE files from one server to another server?

    - by yudayyy
    In my office, I have to schedule moving files from one computer to another (both run Elastix). My idea is to use cron, scp, and rm to do this. So here is the script I use:

      scp -r /home/data/* [email protected]:/home/data1 && rm -r /home/data/*

    That script does the copy, but does not remove the source files. I already read this question: Hov to _MOVE_ files with scp? The problem is that the computer doesn't have an internet connection, so I cannot install rsync on my Elastix computer:

      yum install rsync
      Loaded plugins: fastestmirror
      Loading mirror speeds from cached hostfile

    and then it freezes. Any idea how to do this?
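
    One workaround that needs no extra packages (a sketch, reusing the paths from the question; the remote host name is hypothetical): stream the tree with tar over ssh, and delete the source only if the whole pipeline succeeds.

      # Copy the tree to the remote box, then remove the source only on success
      tar -C /home/data -cf - . \
          | ssh user@remotehost 'tar -C /home/data1 -xf -' \
          && rm -rf /home/data/*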


  • Ubuntu fails to start

    - by miccaman
    I have a laptop with Ubuntu 9.10 which fails to start, and I want to copy the data from it to an external hard disk. I can log in to a recovery-mode command line, but then I cannot mount the external hard drive (and in recovery mode I cannot write to the laptop's hard drive). If I boot from a portable USB stick with Linux Mint, I can mount the external hard drive and copy most of the data from the laptop; however, there is a directory under /home/user/Documents which I have no rights to access, and I get a permission-denied error. Are there any other options?
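
    Since the live session can run commands as root, escalating the copy usually gets past the ownership mismatch. A minimal sketch (the mount points are hypothetical):

      # From the Mint live session: copy as root, preserving permissions and times
      sudo rsync -a /media/laptop/home/user/Documents/ /media/external/Documents-backup/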


  • Recommendations for an efficient offsite remote backup solution for VMs

    - by senorsmile
    I am looking for recommendations for backing up my current 6 VMs (soon to grow to as many as 20). Currently I am running a two-node Proxmox cluster (a Debian base using KVM for virtualization, with a custom web front end for administration). I have two nearly identical boxes with AMD Phenom II X4s and ASUS motherboards. Each has four 500 GB SATA II HDDs: one for the OS and other data for the Proxmox install, and three using mdadm+DRBD+LVM to share 1.5 TB of storage between the two machines. I mount LVM images in KVM for all of the virtual machines. I currently have the ability to do live migration from one machine to the other, typically within seconds (it takes about 2 minutes on the largest VM, running Windows 2008 with MS SQL Server). I am using Proxmox's built-in vzdump utility to take snapshots of the VMs and store those on an external hard drive on the network. I then have the JungleDisk service (using Rackspace) to sync the vzdump folder for remote offsite backup. This is all fine and dandy, but it's not very scalable. For one, the backups themselves can take up to a few hours every night. With JungleDisk's block-level incremental transfers, the sync only transfers a small portion of the data offsite, but that still takes at least half an hour. The much better solution would of course be something that lets me instantly take the difference between two points in time (say, what was written from 6am to 7am), zip it, then send that difference file to the backup server, which would instantly transfer it to the remote storage on Rackspace. I have looked a little into ZFS and its ability to do send/receive. That, coupled with piping the data through bzip or something, would seem perfect. However, it seems that implementing a Nexenta server with ZFS would essentially require at least one or two more dedicated storage servers to serve iSCSI block volumes (via zvols?) to the Proxmox servers. I would prefer to keep the setup as minimal as possible (i.e. NOT having separate storage servers) if at all possible. I have also briefly read about Zumastor. It looks like it could also do what I want, but it appears to have halted development in 2008. So: ZFS, Zumastor, or something else?
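
    For reference, the ZFS incremental send/receive flow described above looks roughly like this (a sketch; the pool name tank, the filesystem vmdata, and the host backuphost are all hypothetical):

      # Snapshot at each point in time
      zfs snapshot tank/vmdata@0600
      zfs snapshot tank/vmdata@0700

      # Ship only the 06:00 -> 07:00 delta, compressed, to the backup host
      zfs send -i tank/vmdata@0600 tank/vmdata@0700 \
          | bzip2 \
          | ssh backuphost 'bunzip2 | zfs receive backup/vmdata'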


  • Minimize VirtualBox hard disk size

    - by Aviv
    I have Ubuntu Server 10.04 LTS installed on a virtual machine in VirtualBox. The virtual disk is a dynamically growing hard drive with a 32 GB maximum. At the beginning I had 4 GB of data on the drive, and the size of the .vdi was 4 GB. Lately the data on the disk is 15 GB, but the size of the .vdi is almost 32 GB. Why is that? How can I pack / optimize / defrag the virtual disk so it is about the same size as the data on it? Thanks.
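
    The usual recipe (a sketch; the filler path and the .vdi path are hypothetical): blocks freed inside the guest still occupy space in the image until they are zeroed, so zero out the free space in the guest, then compact the image from the host.

      # Inside the guest: fill free space with zeros, then remove the filler
      sudo dd if=/dev/zero of=/var/tmp/zerofill bs=1M; sudo rm /var/tmp/zerofill
      sync

      # On the host, with the VM shut down: compact the image
      VBoxManage modifyhd ~/VirtualBox/ubuntu-server.vdi --compact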


  • Cannot find SQL instance at all (while installing an ASP.NET app on IIS)

    - by giddy
    So I'm really not a DBA, I'm an app dev. I had to install my ASP.NET MVC 3 app on my client's (a large company) IIS6 + Win2k3 machine, with absolutely no help from their sysadmins. The final problem now is SQL Server 2008 R2: after figuring out how to create a login from Windows, my app and sqlcmd.exe always complain that they cannot find a SQL Server instance! I have all the SQL services (in services.msc) set to log on as Local System. I can log in fine with SQL Server Management Studio using Windows authentication. I created my database; my ASP.NET app needs/uses Windows authentication. But for the love of God, whatever I do, my app always complains it cannot find the instance. (I also tried running sqlcmd, and it complains of the same thing!) My database connection string looks like this:

      Data Source=machinename\username;Initial Catalog=myDataStore;Integrated Security=True;MultipleActiveResultSets=True

    Machinename\username is the same thing that shows up on the SQL Server Management Studio login if I choose Windows authentication, right?
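
    One thing that stands out (an assumption based on the connection string shown): Data Source should be machine\instance, not machine\user; the machine\user string SSMS shows under Windows authentication is the login, not the server. Listing the visible instances can confirm the right value (SQLEXPRESS below is only an example):

      :: List SQL Server instances visible on the network
      sqlcmd -L

      :: Then point Data Source at machine\instance, e.g.
      :: Data Source=MACHINENAME\SQLEXPRESS;Initial Catalog=myDataStore;Integrated Security=True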


  • Why does my Excel document have 960,000 empty rows?

    - by C-dizzle
    I have an Excel document, Office 2007, on a Windows 7 machine (if that part matters any; I'm not sure, but just throwing it out there). It is a list of all employee phone numbers. If I need to generate a new page, I can click on page 2 and the table will automatically generate again. The problem: someone messed it up (it's on a network drive), and it now shows over 960,000 rows of data when I really don't have that many! I did CTRL+END to see if any data was in the last cell, cleared it out, and deleted that row and column, but that still didn't fix it. It almost seems like it duplicates itself after the deletion. How can I fix this instead of recreating the entire document?


  • Is rsync --delete safe in case of disk failure?

    - by enedene
    I have two data hard drives in my Linux server, and I use the second as a backup of the first. I use rsync for that purpose. An example would be:

      rsync -r -v --delete /media/disk1/ /media/disk2/

    What this does is copy every file/directory from /media/disk1/ to /media/disk2/, but also delete any difference. For example, let's say files A and B, but not file C, are on disk1, and disk2 has file C but not A and B. The result is that after the command, disk2 has files A and B, but file C is deleted, just like on disk1. Now, a rather disastrous scenario has crossed my mind: what if disk1 dies? The system continues to work, since the system files are on my system disk, but when rsync tries to back up my data from the broken disk1 to disk2, it deletes all the files from disk2 because it can't read anything on disk1. Is this a possible scenario, or is there protection against it built into rsync?
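
    A common defensive pattern (a sketch, reusing the paths from the question): refuse to run unless the source is actually a mounted filesystem, so a dead or missing disk1 can never be mirrored as "empty" onto disk2.

      # Abort unless /media/disk1 is really a mount point (mountpoint is in util-linux)
      mountpoint -q /media/disk1 || { echo "disk1 not mounted; skipping backup" >&2; exit 1; }
      rsync -r -v --delete /media/disk1/ /media/disk2/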


  • Enterprise online backup providers

    - by PHLiGHT
    We've used Iron Mountain's LiveVault service but found that it was only good for file-level backups. We liked how it backed up every 15 minutes, but it doesn't support Exchange 2007-2010 and the web interface was very poor. What is everyone else using? The most notable names in online backup, such as Mozy and Carbonite, don't really seem suitable for larger companies. We have SQL, Exchange and SharePoint servers, and are looking to virtualize in the near future. Until then, bare-metal restore capability would be nice. We are currently using Backup Exec 12.5, but that can be so troublesome at times. We have about 2 TB of data, of which 1 TB is archival.


  • Is there any way of preventing .csv files from being converted into Excel format?

    - by Kevin Trainer
    I'm trying to work with an automated testing tool which can use .csv files as its data source. After saving a Notepad file containing a number of fields and values separated by commas as .csv, it appears to have been converted to an Excel file. When I run the test, only the first line of values is identified and can be used within the automated test. I'm not sure if this is expected with the testing product (www.badboy.co.au), but I just wondered if there was a way of preventing Excel from taking control of the .csv file? Any helpful feedback would be great.


  • Windows folder encryption

    - by Razor
    My situation: I know that BitLocker is meant to encrypt whole drives, but I have a hard drive that is already fully partitioned and contains data. I'd like to encrypt part of one partition, leaving the rest of the partition accessible. I would very much like to avoid programs like Norton PartitionMagic (which resize/split partitions), because every time I used them I had problems with the stored data.

    Question: Is there any way, built-in alternative, or 3rd-party app that integrates with the Windows login to encrypt one subset of a partition?

    EDIT: I have heard horror stories about EFS, which is why I don't want to use it, unless there have been reliability improvements in Windows 8. Some highlights from that article: "In fact I've only used EFS twice in the last ten years on my own computers and on both occasions I've lost files and documents. I therefore cannot recommend you ever encrypt your files with this Windows feature." "Unfortunately, because of incompatibilities with some differing versions of EFS files can end up scrambled and unrecoverable."


  • Feeding the kernel's entropy source from other machines and/or increasing its maximum size

    - by David Spillett
    We have had a little trouble with a small box that acts as a VPN endpoint and mail relay for our network, caused by the available entropy for /dev/random being too low (which causes TLS connection attempts by Exim to fail). The machine doesn't do anything else, so the normal feed into the entropy pool (interrupt timings from things like disk access) is not enough. As a quick hack I've set up a looping script that reads from /dev/hda at a couple of Mbyte/sec, which keeps it topped up. Other than buying a hardware RNG, is there a clean way of piping in entropy from elsewhere, such as a copy of the data our file server uses for its entropy source? I've spotted several tips on using rng-tools to feed it from /dev/urandom on the same machine, but that "feels dirty". Also, is it possible to increase the maximum pool size? It currently seems to max out at 3585.
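
    On the second question: the kernel exposes the pool size read-only, and on 2.6-era kernels it is fixed at 4096 bits, which is consistent with entropy_avail topping out a little below that. A quick check:

      # Current entropy estimate and the (fixed) pool size, both in bits
      cat /proc/sys/kernel/random/entropy_avail
      cat /proc/sys/kernel/random/poolsize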


  • Optimal Disk Setup for OLTP SQL Server

    - by Chris
    We have a high-transaction (lots of reads and writes) database server (running SQL 2005) that is currently set up with a RAID 1 OS partition (C:) and a RAID 5 data/log/tempdb partition (D:). The C: has 2 drives and the D: has 4 drives. The server has around 300 databases ranging from 10 MB to 2 GB in size. I have been reading up on best practices for partitioning the disks, but would like some opinions on our setup, since we are so limited in the number of disks. RAID 10 seems popular, but I don't think we could use it with only 6 disks in total. Thanks.

    Update: I went with three RAID 1 arrays (2 disks each):
    Partition 1: OS, TempDB, backups
    Partition 2: logs
    Partition 3: data


  • How to restore one contact from Address Book with Time Machine

    - by doekman
    I want to restore one contact from my Address Book with Time Machine. To do so, I select the contact in Address Book, then press the Time Machine icon in the Dock, and my Address Book is "taken into space". However, when I browse back in time (either by pressing the back arrow or selecting a time on the right), the contact details do not change, and I am sure the data has changed between those dates. Also, when I do press Restore, I still get the new data, not the backup. Is this a bug, or am I doing something wrong? I'm using OS X 10.6.3 with an external USB drive on an iMac.


  • Multiple SSL vhosts using wildcard certificate in nginx

    - by vvanscherpenseel
    I have two hostnames sharing the same domain name which I want to serve over HTTPS. I've got a wildcard SSL certificate and created two vhost configs.

    Host A:
      listen 127.0.0.1:443 ssl;
      server_name a.example.com;
      root /data/httpd/a.example.com;
      ssl_certificate /etc/ssl/wildcard.cer;
      ssl_certificate_key /etc/ssl/wildcard.key;

    Host B:
      listen 127.0.0.1:443 ssl;
      server_name b.example.com;
      root /data/httpd/b.example.com;
      ssl_certificate /etc/ssl/wildcard.cer;
      ssl_certificate_key /etc/ssl/wildcard.key;

    However, I get the same vhost served for either hostname.
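
    One quick check (an assumption about the likely cause): name-based virtual hosting over TLS needs SNI, and without it nginx falls back to the default (first) server block for every hostname on that socket. The build advertises SNI support in its version output:

      # Should print the line below if the build supports name-based TLS vhosts
      nginx -V 2>&1 | grep -o 'TLS SNI support enabled'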


  • Forgot to unmount/eject external hard drive, lost moved files (Mac OS X)

    - by balupton
    So I was using my Mac with my external hard drive connected via USB. I moved about 10 GB of data to it (via drag and drop while holding down the Command key, to move the files rather than copy them). They moved to the drive all right, but as I was having some issues and the Finder crashed after the transfer, I was unable to eject the volume; later everything froze, so I had to do a hard restart (hold the power button). When I remounted the volume (plugged the external hard drive back in), it no longer had any of the files I had moved onto it. As it was a lot of data, how can I recover these files?


  • Testing performance from around the world - how do I get a Linux shell easily in multiple countries?

    - by Matthew O'Riordan
    We are building a socket-based service where latency is paramount, and as such we have servers distributed across 7 data centres around the world. However, whilst we know we're bringing the servers closer to the clients, it's very difficult to know how effective this is and, importantly, what difference it makes compared to our competitors. As such, we want to run simple scripts that test latency and throughput for both our service and our competitors'. That is easy enough using Amazon, but Amazon only has 7 data centres. We would like to know how we perform in locations all over the world, such as South Africa, Australia, China, Peru, etc. Does anyone know of a service that would let us piggyback off their global infrastructure and run some scripts to test this performance? The obvious contenders are companies like Monitis, but I don't think they would allow us to run custom scripts, only standard protocol monitors. Thanks for your help. Matt


  • Date-based sum in Excel / Google Docs spreadsheets

    - by alumb
    I have a bunch of rows with a date and a dollar amount (expenses). I want to produce a list of the days of the month and the running balance of the expenses: for example, the 5th entry in the list would be 8/5/2008 and the sum of all the expenses that occurred on or before 8/5/2008. Approximately this is =sumif(D4:D30-A5,">0",E4:E30), but of course that doesn't work (the source dates are in D4:D30 and the expenses are in E4:E30). Notes:
    - The source data can't be sorted, for various reasons.
    - It must work in Google Spreadsheets, which supports a fairly complete subset of Excel's functions.
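
    For what it's worth, a form that works in both Excel and Google Spreadsheets (assuming, as above, dates in D4:D30, amounts in E4:E30, and the running-balance date in A5) is to compare the dates inside SUMIF instead of subtracting them:

      =SUMIF(D$4:D$30, "<="&A5, E$4:E$30)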

