Search Results

Search found 39577 results on 1584 pages for 'temp files'.

Page 389/1584

  • Should I go along with my choice of web hosting company or still search?

    - by Devner
    Hi all, I have been searching for a good web hosting company that offers all the services I need for my PHP and MySQL based website. It is a community site, so users will be able to upload pictures and other files. The hosting company I have in mind currently lets me do everything I need -- use mail(), run cron jobs, and so on -- for about $6/month. The only problem with this company is that they cap each hosting account at 50,000 files, which rather contradicts the "UNLIMITED SPACE" advertised on their front page. Apart from this, I know of no other reason not to go with them. But that 50,000-file limit is something I cannot live with once the user base grows and the number of uploaded files exceeds 50,000.

    Since this is a dynamic website that also handles sensitive things like payments, I am not sure whether I should go ahead with this company now, as I am just starting out, and later switch to a better host that does not impose the 50,000-file limit. If I do switch later, I will need to back up every file in the account (jpg, zip, etc.) and upload them all to the new host, and I am not aware of any tools that can help with that process -- can you please mention any you know of? I could go with the other companies right now, but their cost is double or triple the current price and they all offer fewer features than my current choice; they are only ready to accommodate my higher demands if I pay more. Unfortunately, the company I am willing to go with now does NOT have any higher or better plans that I could switch to later, so that is the really, really bad part.

    So my question(s): (1) Since I am starting out and the initial user base is going to be small, should I go ahead with my current choice and then, once demand increases, switch over to a better provider? If yes, how can I transfer my database and all the files, especially the jpgs, to the new provider? I don't even know the tools required to back up and restore to another host. (2) (I don't like this idea, but still...) Should I go ahead and pay more right now for a better provider -- without knowing whether the website is really going to do that well -- just to save myself the trouble of backing up 50,000 files and uploading them from the old host to a new one, and start paying double or triple the price without knowing whether the returns will be what I expected?

    Backing up and restoring in such bulk is something I have never done before, which is why I am stuck trying to decide. The price per month is also a considerable factor in my decision. All these web hosting companies say one common thing: it is the customer's responsibility to back up and restore data, and they are not liable for any loss. So no matter which hosting company I go with, they ask me to take backups via FTP so that I can restore them whenever I want (and it seems safer to keep the files locally with me anyway). Some provide tools for backup and some do not, and I am not sure how much their backup tools can be trusted, considering the disclaimers they attach.
    I have never backed up and restored 50,000 files from one web host to another, so please, all you experienced people out there, leave your comments and let me know your suggestions so that I can decide. I have spent two days fighting with myself trying to decide, and finally concluded that this is a double-edged sword and I can't arrive at a satisfactory final decision without other people's suggestions. Someone out there must have had to make a similarly troublesome decision. All your suggestions to help me decide are appreciated. Thank you all.
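    For the migration itself, here is a minimal sketch of one common approach, assuming both the old and the new host allow shell/SSH access (the host names, users, database name and paths below are placeholders, not anything from your actual accounts):

        # 1. Dump the database on the old host
        mysqldump -u dbuser -p communitydb > communitydb.sql

        # 2. Copy the uploaded files and the dump to the new host in one pass.
        #    rsync can resume after a dropped connection, which matters with ~50,000 files.
        rsync -az --partial --progress /home/olduser/public_html/uploads/ \
              newuser@newhost.example.com:/home/newuser/public_html/uploads/
        scp communitydb.sql newuser@newhost.example.com:

        # 3. Load the dump on the new host
        mysql -u dbuser -p communitydb < communitydb.sql

    If either host only offers FTP rather than SSH, lftp's mirror mode is a rough equivalent (again with placeholder credentials): lftp -e "mirror -R /local/uploads /public_html/uploads; quit" -u ftpuser ftp.newhost.example.com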

    Read the article

  • rsync windows to linux permission denied

    - by user64908
    Using the command

        rsync -avzP --delete --omit-dir-times ../../ [email protected]:/var/www/mysite/

    I'm getting:

        rsync: mkstemp "/var/www/mysite/.." failed: Permission denied (13)

    If ext is in the www-data group, should I still set all the files to be owned by user www-data? I am trying to publish the files with rsync and then set the permissions using

        sudo chown -R www-data doc
        sudo chgrp -R www-data doc

    but I can't even rsync because of the permission denied error. SSH works fine, and rsync works too, except when it tries to write over or update some of the files in /var/www.

    Client:
    * Windows 7
    * Cygwin 1.7.16 (GNU bash, version 4.1.10(4)-release (i686-pc-cygwin))
    * rsync version 3.0.9, protocol version 30

    Server:
    * Ubuntu 12.04
    * Apache2
    * Root accounts: [ubuntu, ext]
    * Groups: [www-data]
    * sudo vigr shows www-data:x:33:ubuntu,ext

    I have already configured this: http://stackoverflow.com/questions/2124169/cwrsync-ignores-nontsec-on-windows-7
    This article has also managed to confuse me: http://unix.stackexchange.com/questions/41687/how-should-i-rsync-files-in-var-www-if-i-want-them-to-be-owned-by-www-data
    What is the right procedure?
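    One way out of this, sketched under the assumption that the goal is for the web root to end up group-writable by www-data (the paths are from the question, the rest is illustrative): make the remote user's group access sufficient on the server, so rsync does not need root at all:

        # one-time setup on the server
        sudo chgrp -R www-data /var/www/mysite
        sudo chmod -R g+rwX /var/www/mysite
        sudo find /var/www/mysite -type d -exec chmod g+s {} \;   # new files inherit the www-data group

    After that, rsync running as the remote user (ext, presumably) should be able to create its temporary files. An alternative, if sudo on the server is configured to allow it without a password, is to run the remote side as root with --rsync-path="sudo rsync".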

    Read the article

  • Understanding Unix Permissions (w/ ACL)

    - by Dr. DOT
    I am trying to set permissions on my server properly. Currently I have a number of directories and files chmod'd to 0777, but I am not comfortable with leaving them that way. So, at the advice of a Server Fault specialist, I had my hosting provider install ACL support on my shared virtual server.

    When I FTP to the server as my FTP user account "abc", I can do everything I need to do (and rightfully so), because all my dirs and files are owned by "abc", the group is "abc", and the first octet is set to 7 (rwx). That much I get.

    But here's where it gets dark gray for me. PHP runs as user "nobody", so when someone browses one of my web pages that either ends in .php or has some PHP embedded, I assume the last octet controls the access. Because all my dirs and files are owned by "abc" and assigned to group "abc", if the last octet were a 4 (r--) then the server would let the browser read the file; if it were a 6 (rw-), the server would also let the browser write to the file or directory, correct? What if the web document does not end in .php and has no PHP embedded -- what is the user then? And how can I use ACLs so that I do not have to set that last octet to 6 (rw-) or even 7 (rwx)? [I'm not sure what execute does or means here.]

    I'm just looking for some sort of policy settings to best lock down my dirs and files while still allowing my PHP scripts to do uploads and write to files (so my users don't call me to say "permission denied"). Ok, thanks to anyone out there willing to lend me a hand. It is greatly appreciated.
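    A sketch of what such a policy could look like with POSIX ACLs, assuming the PHP user really is "nobody" and using a hypothetical upload directory (adjust the path to your own layout):

        # keep the world bits tight ...
        chmod -R o-rwx /home/abc/public_html/uploads

        # ... and give only the PHP user write access, via an ACL
        setfacl -R -m u:nobody:rwX /home/abc/public_html/uploads
        setfacl -d -R -m u:nobody:rwX /home/abc/public_html/uploads   # default ACL: new files inherit it
        getfacl /home/abc/public_html/uploads                         # verify

    The X in rwX grants execute only on directories (or on files that already have an execute bit), which is what lets a process traverse into a directory without making every data file executable.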

    Read the article

  • Remote file copy util (like rsync) but that will take account of data already copied (in this session)

    - by Rory McCann
    Let's say I have a directory with 2 files, both identical and quite large (e.g. 2 GB each). I want to rsync that directory to a remote host. As I understand it (and I could be wrong), rsync calculates checksums of files. Surely if it sees 2 files with the same checksum it can just copy the first file, then do a local copy on the remote host for the 2nd file? That would make it faster, no? On a similar note, doesn't rsync hash all the remote files before copying? If it saw a different file with the same hash as a file that was to be transferred, it could do a local copy on the remote host. Does rsync support this sort of thing? Is there some way to turn it on? Is there a tool similar to rsync that will do this sort of 'hash-based' local copy?
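    As far as I know rsync has no built-in "duplicate detected, clone it remotely" mode (--copy-dest/--link-dest only reuse files that already exist on the destination under the same relative path). A manual workaround sketch, with placeholder file names and paths:

        # 1. push one copy of the large file
        rsync -avP bigdir/file1.bin remote:/data/bigdir/

        # 2. clone it on the remote host under the second name
        ssh remote 'cp /data/bigdir/file1.bin /data/bigdir/file2.bin'

        # 3. run the normal sync; with --checksum rsync compares by content,
        #    sees that file2.bin already matches, and skips transferring its data
        rsync -avP --checksum bigdir/ remote:/data/bigdir/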

    Read the article

  • Episode by episode DVD Rip to AVI on Windows 7.

    - by ProfKaos
    I'm looking for a Windows tool to rip DVD files to AVI files. VidCode looks cool, but it wants to convert the whole DVD (all the files in the VIDEO_TS folder) to one AVI. I would like to pull each episode on the DVD into its own AVI. Is there a way to do that with VidCode, or with another ripper? My Plan B for VidCode is to manually separate the files for each episode into their own VIDEO_TS folders. Is this possible? Is there an easy, lazy way to automate it?

    Read the article

  • Untar with date filter

    - by Don
    Is there any way to untar and extract only those files that are newer than a certain date, including the directory structure? I restored a backup on a play server, but it was a few days old. However, I have a tar archive of the entire structure that is more up to date and healthy, so now I want to extract all files (including directory structure) based on a date filter on the files, if possible.
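    GNU tar's --newer/-N option is meant for archive creation rather than extraction, so one workaround sketch is to build a file list from the archive listing and extract only those paths (the archive name and cutoff date are examples, and the awk field split breaks on file names containing spaces):

        # list the archive with dates, keep entries dated on or after the cutoff
        tar -tvf backup.tar | awk '$4 >= "2012-08-01" {print $6}' > newer.list

        # extract only those paths; parent directories are created as needed
        tar -xvf backup.tar --files-from=newer.list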

    Read the article

  • Windows Search Indexing SSD

    - by user654628
    I just bought a new SSD (a Vertex4 from OCZ, 256 GB) and installed Windows 8 on it on my laptop. I am not using an external hard drive to hold extra data (paging, temp files, etc.) because this is a laptop and I do not want to carry a drive around with me. My question is: if I disable the Windows indexer (the Windows Search service), will Windows still find files through the search box (the Windows 7 Start menu search / the Windows 8 Metro UI search)? Since the indexer is meant to find files by indexing them, does disabling it mean that Windows will not find new files? Thanks in advance.

    Read the article

  • Proving file creation dates

    - by Nils Munch
    In a weird case surrounding the copyright of a software system I developed, I am relying on the fact that I have all the source files of the system in question, created long before I joined the company that claims to own the system. The company being sued by yours truly says that I have simply manipulated the files to appear to be from that date. Is it even possible to fake or manipulate creation dates? And if so, how can I "prove" that the files really are that old? Luckily, I stored my project on GitHub, which confirms that the files are from that era, but that is beside the point. I run purely Apple OS X.
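    If the GitHub history does become relevant, here is a sketch of how the repository itself can document a file's age (the file path is a placeholder):

        # the commit that first added a given source file, with author and committer dates
        git log --diff-filter=A --follow --date=iso \
                --format='%h  author: %ad  committer: %cd  %s' -- path/to/file

        # the full history, hashes included, can be packaged as a single artifact
        git bundle create project-history.bundle --all

    Commit dates can of course also be forged, but a long chain of commits pushed to a third-party host years ago is much harder to explain away than a filesystem timestamp.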

    Read the article

  • How can I specify multiple rules for a particular log file(s) with logrotate?

    - by Ether
    I have a logrotate.d config file that looks something like this:

        /home/myapp/log/* {
            daily
            compress
            dateext
            ifempty
            delaycompress
            olddir /home/myapp/baklog
        }

    There are a few particular log files where I want to apply additional rules, such as "mail". How can I apply additional rules to just some files? If I add another rule above that matches the additional files (e.g. /home/myapp/log/warning.log { ... }), I get an error like:

        error: /etc/logrotate.d/myapp:3 duplicate log entry for /home/myapp/log/warning.log

    How can I specify multiple rules that match particular files in an overlapping kind of way?
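    Logrotate refuses overlapping patterns outright, so the usual answer is to make the patterns non-overlapping rather than layered. A sketch, assuming the special files can be listed by name and the wildcard can be narrowed so it no longer catches them (the file names and the *.out.log suffix are illustrative):

        # special handling first, listing the exceptional logs explicitly
        /home/myapp/log/warning.log /home/myapp/log/mail.log {
            daily
            rotate 30
            compress
            olddir /home/myapp/baklog
        }

        # everything else -- the pattern must not match the files above,
        # e.g. because the ordinary logs share a distinct suffix
        /home/myapp/log/*.out.log {
            daily
            compress
            dateext
            ifempty
            delaycompress
            olddir /home/myapp/baklog
        }

    If the ordinary logs cannot be told apart by pattern, the remaining options are to rename them so they can be, or to keep a single stanza and accept the same policy for everything.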

    Read the article

  • Windows Network copy and access denied randomly

    - by The King
    I have a Windows 2008 R2 server and I have now installed a new, bigger HDD in it. I wanted to copy big AVI files to the new server HDD, which is shared on the local network. I have write access to the server's HDD and I can successfully copy smaller files to it. But when I copy bigger files, more than 500 MB, I randomly get an Access Denied message during the copy. If I use RDP, I can copy the files through the RDP client. I checked the error messages on the server but I didn't find any error about this access denial. Because the copy works over RDP, I don't think this is a hardware error; I think it is some kind of software setting error. Has someone faced this kind of error? Or does anybody have an idea what could cause it, or how to find the root of the problem?

    Read the article

  • backing up ntfs disk using rsync on ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc., which I periodically back up to an external drive. In Windows I used SyncToy to do this. After my Windows install stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10 GB of extra files added. So I try to use rsync to synchronise the two drives like this:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup

    The problem is that the dry run indicates that every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files that already exist don't get copied again? Thanks.
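    Two things are worth checking here, sketched below rather than stated with certainty. First, without a trailing slash on the source, rsync copies the directory itself into the destination, so the command above would populate /media/FREECOM HDD/Backup/Backup and everything would look "new". Second, NTFS under Linux cannot hold Unix ownership and permissions the way rsync expects, so comparing only on size and modification time is the usual approach:

        rsync --dry-run -rvt --modify-window=1 \
            /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"

    If timestamps still defeat the quick check, --size-only is a cruder fallback that copies only files whose sizes differ.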

    Read the article

  • How to copy Windows Explorer file selections and paste the filenames as text

    - by Ricky Williams
    Win7. How can I select multiple files in Explorer and get a string of text listing the file names, like I would get if I selected the files from within the "Open File" window of an application? For example, if I have a dir that contains 100 jpegs and I select 01.jpg, 05.jpg and 10.jpg in Windows Explorer via ctrl+clicks or shift+click, how can I get a text string that reads "01.jpg" "05.jpg" "10.jpg" (with or without the selected files' full paths), without dragging the selected files to another window? I've tried copying and pasting into Notepad and WordPad, and it either pastes the pictures into the application or pastes an empty clipboard.
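    One route that may help, assuming Windows PowerShell 5.x is available (the -Format parameter is a Windows PowerShell feature): copy the files in Explorer with Ctrl+C, then read the clipboard's file-drop list and format it yourself (the quoting below just mimics the Open dialog style):

        # names only, quoted, on one line
        (Get-Clipboard -Format FileDropList | ForEach-Object { '"{0}"' -f $_.Name }) -join ' '

        # or full paths, one per line
        Get-Clipboard -Format FileDropList | ForEach-Object { $_.FullName }

    A lower-tech alternative is Explorer's own "Copy as path" entry, which appears on the right-click menu when you hold Shift; it puts the quoted full paths on the clipboard as plain text.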

    Read the article

  • Compare-Object gives false differences

    - by Andy
    I have a problem with Compare-Object. My task is to get the difference between two directory snapshots made at different times. The first snapshot is taken like this:

        ls -recurse d:\dir | export-clixml dir-20100129.xml

    Then, later, I take the second snapshot and load both of them:

        $b = (import-clixml dir-20100130.xml)
        $a = (import-clixml dir-20100129.xml)

    Next, I try to compare them with Compare-Object, like this:

        diff $a $b

    What I get is, in some places, files that were added to $b since $a, but in other places files that were in both snapshots; and some files that were added to $b are not shown in the Compare-Object output at all. Puzzlingly, $b.count - $a.count is EXACTLY the same as (diff $a $b).count. Why is that?

    Ok, Compare-Object has a -Property param. I try to use that:

        diff -property fullname $a $b

    And I get a whole mess of differences: it shows me ALL the files. For example, say $a contains:

        A\1.txt
        A\2.txt
        A\3.txt

    And $b contains:

        X\2.mp3
        X\3.mp3
        X\4.mp3
        A\1.txt
        A\2.txt
        A\3.txt

    The diff output is something like this:

        X\2.mp3 =>
        A\1.txt <=
        X\3.mp3 =>
        A\2.txt <=
        X\4.mp3 =>
        A\3.txt <=
        A\1.txt =>
        A\2.txt =>
        A\3.txt =>

    Weird. I think I don't understand something crucial about Compare-Object usage, and the manuals are scarce... Please help me get the DIFFERENCE between two directory snapshots. Thanks in advance.

    UPDATE: I've saved the data as plain strings like this:

        import-clixml dir-20100129.xml | % { $_.fullname } | out-file -enc utf8 a.txt

    And the results are the same. Here are excerpts of both snapshots (the top 100-something lines, a.txt and b.txt), the output of Compare-Object, and the output of UNIX diff (unified). All files are UTF-8: http://dl.dropbox.com/u/2873752/compare-object-problem.zip
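    A sketch of one way to sidestep the object-comparison question entirely: reduce each snapshot to a sorted list of path strings before comparing, so Compare-Object only ever sees simple strings (the file names are the ones from the question):

        $a = Import-Clixml dir-20100129.xml | ForEach-Object { $_.FullName } | Sort-Object
        $b = Import-Clixml dir-20100130.xml | ForEach-Object { $_.FullName } | Sort-Object
        Compare-Object -ReferenceObject $a -DifferenceObject $b

    When whole deserialized FileInfo objects are compared, what counts as "equal" is much less obvious than with plain strings, so sorting and comparing the path strings is the simplest way to get a diff you can trust.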

    Read the article

  • Domain migration - 301 redirect of all contents of a directory

    - by Trufa
    Hi, I would like to know if the following is possible, considering that I would like to migrate domains. I have, let's say:

        one.com/files/one.html
        one.com/files/two.php
        one.com/other/three.html
        one.com/other/four.doc
        one.com/other/subdirectory/five.doc

    I am migrating to two.com, so I would like to set up the RESPECTIVE 301 redirects to the following:

        two.com/old/files/one.html
        two.com/old/files/two.php
        two.com/old/other/three.html
        two.com/old/other/four.doc
        two.com/old/other/subdirectory/five.doc

    I've tried with cPanel, and although I get "close" with the redirects option, I can't seem to make it happen. The folders are not many (10-12), but the files are a lot, and obviously impossible to handle manually. How would you proceed? Can this / should this be done with a regex in .htaccess? Can you redirect all the elements of a subdirectory in the manner expressed above? I hope the question is clear enough; if not, please ask for any clarification needed! Thanks in advance!!
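    Since the mapping is uniform (every old path goes to the same path under /old/ on the new domain), a single pattern should cover everything. A sketch using mod_alias, assuming .htaccess overrides are allowed on the old domain:

        # .htaccess at the document root of one.com
        RedirectMatch 301 ^/?(.*)$ http://two.com/old/$1

    The mod_rewrite equivalent, if other rewrite rules already live there:

        RewriteEngine On
        RewriteRule ^(.*)$ http://two.com/old/$1 [R=301,L]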

    Read the article

  • How can I view a .eml file from command line in Windows Vista?

    - by Nosrettap
    Ok, so my parents' computer crashed (horribly -- corrupted registry) and I'm trying to access one of their e-mail files that is saved locally on the hard drive. I can't launch Windows itself, so right now I am in a boot-up command prompt. I've navigated to where the e-mails appear to be stored, C:\Users\[userName]\AppData\Local\Microsoft\Windows Mail\Local Folders\Inbox, and it shows a list of what appear to be the files. The problem is, they are .eml files and I can't seem to be able to open them. I've tried the 'vim' and 'vi' commands, but it tells me that 'vim' is not recognized as an internal or external command. Does anyone know how I can view .eml files from the command line? Thanks
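    .eml files are plain RFC 822 text (headers plus body), so the built-in cmd.exe tools are enough to read them, assuming the message body isn't base64-encoded HTML; the file name below is hypothetical:

        cd "C:\Users\[userName]\AppData\Local\Microsoft\Windows Mail\Local Folders\Inbox"
        dir *.eml
        more "some message.eml"

    more pages through the file one screen at a time; type is the non-paged alternative, and findstr /i "subject: from:" *.eml is a quick way to spot which file holds the message you want.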

    Read the article

  • Windows - Decrypt encrypted file when user account is destroyed

    - by dc2
    I have a virtual machine running on my Windows Server 2008 computer. The VM was received by me encrypted, as the builder of the VM built it on a Mac, which encrypts files by default. I never thought to decrypt these files, since they automatically 'decrypt' when you have permission over them, so the VM has been running for over a year despite the encryption. I just upgraded my computer to a domain controller (dcpromo.exe). Now when I try to access or run the VM, I can't, because I don't have permission to decrypt the files: that permission belonged to another logon (the local administrator), and I am now the domain administrator. Apparently the local admin account is totally nuked when you upgrade to a domain controller.

    I have tried EVERYTHING: taking ownership of the files, which works but doesn't get me anywhere; adding full control for Everyone on the files. Under File Properties, Advanced, Details (under encryption), the only entry in "Users who can access this file" is administrator@localcomputername, with a certificate number. I try adding a new certificate; I don't have permission. I don't have permission to decrypt the file (access is denied) or to copy the file to another computer (access denied). I am totally stumped, and this VM is a production machine and needs to be up right now. Does anyone have any ideas?
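    A couple of EFS-related commands that may at least clarify the situation (the file path is a placeholder); they won't conjure up the missing key, but they show exactly whose certificate is required and whether a recovery agent exists:

        :: list the users and recovery agents able to decrypt the file
        cipher /c "D:\VMs\machine.vhd"

    If a backup of the old local administrator's EFS certificate and private key exists (a .pfx exported earlier with cipher /x or the Certificates MMC), importing it into the current profile restores the ability to decrypt; without that key, EFS data is by design not recoverable.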

    Read the article

  • Minimal backup for Windows 7 system recovery [migrated]

    - by JIm
    There might not be an answer to this, but for a home Win7 system, what files/directories must be backed up in order to recover after a Windows crash? I can reinstall software, and I keep data files elsewhere. When I use Acronis home backup software to back up my "critical" files, it seems to choose the entire partition, and the incremental updates are mostly browser cache files and the like. Or, after a crash, should I just reinstall Windows? I dread the hours of Windows updates that would require. Thanks.

    Read the article

  • How to save transferred MP3 on Centova cast?

    - by cHAKE
    We have uploaded and transferred the mp3 files to Centova Cast successfully with FileZilla.
    We know how to update the media library.
    We know how to create playlists.
    We know how to drag the songs into different playlists.
    We know how to save the media or the playlists.
    BUT we are losing all the saved files (songs) in the media library the moment we get new files into the media library. However, we want to keep the media library and playlists saved.

    Read the article

  • Cygwin file and directory user and group

    - by dvanaria
    I use Cygwin as my main development environment on both my home and work computers. In order to share files between the two computers, I use Dropbox, which is installed in the following folder on both:

        c:\cygwin\home\dvanaria\dropbox

    Everything works great, except for one thing. When I'm working on my home computer and do an ls -l on any directory, all the files show up as owned by dvanaria of group Users. But when I work from my work computer, an ls -l shows all files as being owned by Administrators and of group Domain Users. I know Cygwin uses some kind of mapping between Windows users and permissions via the /etc/passwd file, but to be honest I have no idea how this file works or how it maps to Windows under Cygwin. Could anyone help figure this out? The main problem is that I can't edit any files when using my work computer, only read them.
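    On Cygwin versions of that era the /etc/passwd and /etc/group files are generated rather than magic, so a sketch of what may fix the work machine (run from a Cygwin shell there, after backing up the current files) is simply to regenerate them so your domain account gets a proper entry:

        cp /etc/passwd /etc/passwd.bak && cp /etc/group /etc/group.bak
        mkpasswd -l -d > /etc/passwd     # local (-l) and domain (-d) users
        mkgroup  -l -d > /etc/group      # local and domain groups

    After restarting the Cygwin terminal, ls -l should show your own domain user instead of the generic Administrators owner, and the Dropbox files should be writable again, provided the underlying NTFS permissions actually grant your account write access.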

    Read the article

  • Samba/Winbind issues joining an Active Directory domain

    - by Frap
    I'm currently in the process of setting up winbind/samba and getting a few issues. I can test connectivity with wbinfo fine:

        [root@buildmirror ~]# wbinfo -u
        hostname
        username
        administrator
        guest
        krbtgt
        username
        [root@buildmirror ~]# wbinfo -a username%password
        plaintext password authentication succeeded
        challenge/response password authentication succeeded

    However, when I do a getent I don't get any AD accounts returned:

        [root@buildmirror ~]# getent passwd
        root:x:0:0:root:/root:/bin/bash
        bin:x:1:1:bin:/bin:/sbin/nologin
        daemon:x:2:2:daemon:/sbin:/sbin/nologin
        adm:x:3:4:adm:/var/adm:/sbin/nologin
        lp:x:4:7:lp:/var/spool/lpd:/sbin/nologin
        sync:x:5:0:sync:/sbin:/bin/sync
        shutdown:x:6:0:shutdown:/sbin:/sbin/shutdown
        halt:x:7:0:halt:/sbin:/sbin/halt
        mail:x:8:12:mail:/var/spool/mail:/sbin/nologin
        uucp:x:10:14:uucp:/var/spool/uucp:/sbin/nologin
        operator:x:11:0:operator:/root:/sbin/nologin
        puppet:x:52:52:Puppet:/var/lib/puppet:/sbin/nologin

    My nsswitch looks like this:

        passwd: files winbind
        shadow: files winbind
        group:  files winbind
        #hosts: db files nisplus nis dns
        hosts:  files dns

    And I'm definitely joined to the domain:

        [root@buildmirror ~]# net ads info
        LDAP server: 192.168.4.4
        LDAP server name: pdc.domain.local
        Realm: domain.local
        Bind Path: dc=DOMAIN,dc=LOCAL
        LDAP port: 389
        Server time: Sun, 05 Aug 2012 17:11:27 BST
        KDC server: 192.168.4.4
        Server time offset: -1

    So what am I missing?
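    One common culprit in this situation is the idmap configuration in smb.conf: wbinfo talks to winbindd directly, but getent only returns domain accounts once winbind can map them to Unix UIDs/GIDs (and, for a bare getent passwd with no argument, only if enumeration is switched on). A sketch of the relevant [global] settings, with an illustrative ID range:

        [global]
            security = ads
            realm = DOMAIN.LOCAL
            workgroup = DOMAIN

            idmap config * : backend = tdb
            idmap config * : range   = 10000-99999

            winbind enum users  = yes
            winbind enum groups = yes

    After editing, restart winbind and test again. Note that many setups deliberately leave enumeration off for performance, in which case getent passwd someaduser (naming the account explicitly) is the more reliable check than listing everything.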

    Read the article

  • Accidental Extract Location - How to Clean Up?

    - by Gordon
    Sometimes I will run a command such as:

        unzip tons_of_files.zip

    and I will forget to pass -d to point it at a subdirectory. This causes the current folder to get filled with tons of files that are intermixed with the existing files. What is the best way to remove all these new files and/or move them to a new directory? I want to avoid having to manually examine the directory and determine whether each file was part of the archive or was already present.
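    Since the archive itself records exactly which paths it contained, it can drive the cleanup. A sketch (mind files that legitimately existed before under the same names -- there is no way to tell those apart after the fact):

        # remove every file the archive created
        unzip -Z1 tons_of_files.zip | xargs -d '\n' rm -v

        # or, instead of deleting, move them into a subdirectory
        mkdir extracted
        unzip -Z1 tons_of_files.zip | xargs -d '\n' -I{} mv -v {} extracted/

    unzip -Z1 lists one archive member per line; directory entries in the listing end with a slash, so rm will complain about those (harmless), or they can be filtered out first with grep -v '/$'.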

    Read the article

  • Can "tar" backup incrementally?

    - by Somebody still uses you MS-DOS
    I have my home folder with a few GB in it. Is it possible to run tar on it, create a home.tar.gz, and then, for changed files, have it create home1.tar.gz containing only the files modified since the previous tar (thus an incremental backup)? I would also like to generate checksum files for the results and export them as well, like home.md5, home1.md5, etc. (I know this could be a separate process, but it is interesting as well).
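    GNU tar supports exactly this through --listed-incremental, which keeps a small snapshot file recording what has already been archived. A sketch (the archive and snapshot names are just examples):

        # level 0: full backup, creates the snapshot file
        tar --listed-incremental=home.snar -czf home.tar.gz ~/
        md5sum home.tar.gz > home.md5

        # later -- level 1: only files changed since the snapshot was written
        cp home.snar home1.snar          # keep the level-0 snapshot intact
        tar --listed-incremental=home1.snar -czf home1.tar.gz ~/
        md5sum home1.tar.gz > home1.md5

    Restoring means extracting the archives in order (home.tar.gz first, then home1.tar.gz), passing --listed-incremental=/dev/null on extraction so tar treats them as incremental dumps.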

    Read the article

  • Seagate 3TB hard drive loses format information

    - by Victor Bugarin
    I have Windows 7 x64 Ultimate, 6 GB of memory, and a 1 TB internal HD, plus a 3 TB Barracuda XT HDD installed in a StarTech 4-bay external enclosure. I had trouble with the drive, so I converted it to GPT, created one partition, and formatted it as NTFS. I can write to and read from the hard drive, but it becomes unreadable at some point, either while I am copying files or after I have copied files to it. I have copied large Blu-ray movies and diverse video files, 32 GB of pictures, and about 86 thousand music files in different formats. At some point the partition becomes unreadable and I have to format it again (all files lost) and start the whole process over. At one point I was unable to copy large ISO (Blu-ray movie) images at all. I have also partitioned the HDD into 2 partitions (P1 - 2 TB, P2 - 1 TB) and lost every single file in both partitions the same way. After I reformat, the HDD seems fine. I have run SeaTools to check the hard drive and it reports that the drive is OK. What gives?

    Read the article

  • EXECUTE master.dbo.xp_delete_file folder permission issue

    - by Alex
    I'm trying to run a Maintenance Cleanup Task to remove .bak files older than 2 days (simple enough). I've been trying all varieties of .bak, BAK, .*., and editing the path, but the files are still not getting removed, even though I receive a "job succeeded" log message. I'm now at the point where I believe it's a folder permission issue. How do I make sure my SA has the proper permissions to remove files from a folder? Thanks.
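    xp_delete_file runs under the SQL Server service account, not under the login that owns the job, so that Windows account is the one that needs Modify rights on the backup folder. A sketch using icacls from an elevated command prompt (the account and folder names are placeholders):

        icacls "D:\SQLBackups" /grant "DOMAIN\sqlsvc:(OI)(CI)M"

    (OI)(CI) makes the grant inherit to files and subfolders, and M is Modify, which includes delete. It is also worth checking that the extension in the cleanup task is given without the leading dot ("bak", not ".bak"), a common reason the task reports success while deleting nothing.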

    Read the article
