Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • How to disable Spotlight content indexing in Mac OS

    - by o.v.
    Coming from Windows, I could always set Live Search to index only file names, not their content. Is this something that can be done with Spotlight on a Mac? It used to index absolutely everything; for instance, it would return a bunch of video files for any obscure character combination typed into the search field. Right now I've disabled Spotlight entirely as per this answer, but that seems to have disabled searching altogether. For instance, Finder has yet to locate any .pdf files in a small directory as I'm typing this question (unlike Windows search, which still works even with indexing disabled). Alternatively, if there is any way (including a trusted third-party app) to index only file names and metadata, e.g. ID3 tags, that would likely be the preferred option.
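
    For reference, a minimal sketch of the built-in controls: Spotlight indexing can only be switched on or off per volume with mdutil (there is no documented "file names only" mode), and folders can be excluded via the Privacy tab of the Spotlight preference pane. The volume path below is just an example:

      # show the indexing status of the boot volume
      mdutil -s /
      # turn content indexing off (or back on) for a specific volume
      sudo mdutil -i off /Volumes/Media
      sudo mdutil -i on /Volumes/Media
      # erase the existing index so it is rebuilt from scratch
      sudo mdutil -E /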

    Read the article

  • Accidentally deleted the software for MyPassport Essential SE 1TB Hardrive

    - by user26192
    I'm posting for a friend of mine. She bought a WD MyPassport Essential SE 1 TB hard drive the other day. When she plugged its USB cable into her laptop, the drive was not recognized by the SmartWare software. While she was doing a backup of her files, McAfee was running in the background. Since the backup was taking so long to finish, she decided to pause it. She tried to delete the partially backed-up files, but instead she accidentally deleted everything in the folder, including the pre-installed software. Now, when she tries to start up the MyPassport, the SmartWare doesn't show up anymore. Can someone please advise us on what she can do about this? Thank you.

    Read the article

  • How can I make my Ubuntu server accessible to the internet?

    - by wahid
    Hi, I have already installed the applications to make my server a web server. When I type the DHCP-assigned IP address into a web browser, I can access it, but all it shows is the default "It works..." page. I can copy files to /var/www successfully using WinSCP, and yet I cannot see any of those files when I browse to the server from my Windows machine. Secondly, I tried to forward a port on my home SMC router, but it only accepts a local LAN IP, and my Ubuntu server picks up its IP from the router via DHCP. What should I do? Can you help, please? Thanks,
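
    For reference, a minimal sketch of the usual setup (the addresses, interface name and file name below are assumptions, not taken from the question): confirm Apache serves the copied files locally, give the server a fixed LAN address so the router's port forward keeps pointing at it, then forward TCP port 80 on the router:

      # confirm the server's LAN address and that Apache serves a copied file
      ip addr show
      curl http://localhost/yourfile.html
      # give the server a static LAN address (older Ubuntu, /etc/network/interfaces):
      #   auto eth0
      #   iface eth0 inet static
      #       address 192.168.1.50
      #       netmask 255.255.255.0
      #       gateway 192.168.1.1
      # then, on the SMC router, forward external TCP port 80 to 192.168.1.50:80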

    Read the article

  • Windows 2008 Group Policy Setting? - Migration Headache

    - by DevNULL
    I have a small domain of users that I just migrated from a Linux domain running OpenLDAP. Our new servers are running Windows 2008 Standard. I've installed Active Directory and everything is working perfectly... except that the initial user privileges are pretty restrictive and I need to loosen them up a bit. For example, once users log in to their workstations, they can create new files and folders but cannot modify existing files or start. I basically want to open it all up except for software installations. Can someone please help with this migration headache?

    Read the article

  • Is there an image viewer which *won't* show every file in the current folder?

    - by hawbsl
    Looking for a Windows image viewer which can be started from the command line but which allows me to specify/restrict which files I want it to page through, as parameters. The good ol' Windows Picture Viewer would be fine except that it shows/cycles through all the pictures in the current folder. In my case I want to say something like: someimgvwr.exe "cat.jpg" "cow.jpg" "cub.jpg" so that only those three files are displayed and not "pig.jpg", which might also happen to be in the same folder. Actually, if it allowed something like this: someimgvwr.exe "c*.jpg" that would be even better. Do any of the many image viewers out there allow such a thing?

    Read the article

  • Using Windows Azure storage for backup

    - by Bruno
    I am currently looking at Windows Azure blobs as an option for backing up archive data. I want to be able to upload files from an external Windows machine via the internet, but I don't know enough about Windows Azure storage to make a decision. Some of the questions I have are: How do I upload the files? Is there a client application? Can I use Robocopy? Would it be fast enough? I.e., could I download or upload 1 TB of data in a week? Is it secure? Hopefully someone smarter than me can help me :-)

    Read the article

  • Best Practice: Migrating Email Boxes (maildir format)

    - by GruffTech
    So here's the situation. I've got about 20,000 maildir email accounts chewing up several hundred GB of space on our email server. Maildir by nature keeps thousands of tiny a** little files instead of one .mbox file or the like... So I need to migrate all of these millions of files from one server to the other, for both space and life-cycle reasons. The conventional methods I would use all work just fine. rsync is the option that comes immediately to mind; however, I wanted to see if there are any other "better" options out there. rsync not handling multi-threaded transfers sucks in this situation because it never actually gets up to speed and saturates my network connection; because of this, the transfer from one server to the other will take hours upon hours, when it shouldn't really take more than one or two. I know this is highly opinionated and subjective and will therefore be marked community wiki.
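
    For reference, two hedged sketches of how people usually work around single-threaded rsync on millions of tiny files (host names and paths below are assumptions): run one rsync per top-level mailbox directory in parallel, or stream the whole tree through tar over ssh so the per-file round-trip overhead disappears:

      # several rsyncs in parallel, one per mailbox directory (requires GNU parallel)
      ls /var/vmail | parallel -j 8 rsync -a /var/vmail/{}/ newserver:/var/vmail/{}/
      # or stream everything in one pass with tar over ssh
      tar -C /var/vmail -cf - . | ssh newserver 'tar -C /var/vmail -xf -'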

    Read the article

  • How should I organize my backups?

    - by Patrick
    I'm using rsync for the first time to create daily backups of my websites, and I was wondering whether I should overwrite the previous copy or create multiple copies and overwrite only the oldest one (I might not have enough space for that, though). I also have this question: let's suppose most of the files are accidentally erased... does rsync delete all these files from the backup space because they don't exist anymore? How exactly does it work in this case? Thanks
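
    For reference, a minimal sketch of the usual answer to both questions (paths are assumptions): rsync only removes files from the backup if you pass --delete, and --link-dest keeps multiple dated copies that share unchanged files via hard links, so they cost very little extra space:

      # daily snapshot that hard-links unchanged files against yesterday's copy
      rsync -a --delete --link-dest=/backup/daily.1 /var/www/ /backup/daily.0/
      # without --delete, files removed from the source simply stay in the backup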

    Read the article

  • Updating wordpress in a multi-node environment

    - by Peter
    I'm finding this very tricky in a multi-node environment with code under revision control, i.e. multiple frontends and a single database. I have a deployment process that pushes a git repo to the servers, but obviously if I update WordPress from within the admin panel, it will update the files on one FE only. Then I would need to copy the new files over to the other FE nodes. Plus, whenever WordPress updates itself on a node, it writes those changes into the git repo's working tree. That breaks the auto-deploys that perform 'git pull', since the checkout then has untracked changes and refuses to pull in new deploys without manual intervention. How does one easily keep WordPress updated in a multi-node (load-balanced) environment?
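
    For reference, a hedged sketch of one common pattern (it assumes wp-cli is available; host names and paths are placeholders): take updates out of the admin panel, run them once in a deploy/staging checkout, commit the result, and let every frontend receive it through the normal git-pull deploy:

      # in a single deploy/staging checkout, not on the live frontends
      wp core update && wp plugin update --all
      git add -A && git commit -m "WordPress core/plugin updates"
      git push origin master
      # each frontend then gets the update through the usual deploy
      ssh fe1 'cd /var/www/site && git pull'
      ssh fe2 'cd /var/www/site && git pull'
      # optionally set define('DISALLOW_FILE_MODS', true); in wp-config.php
      # so individual nodes can no longer write updates into the working tree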

    Read the article

  • using "touch" to create directories?

    - by user66732
    1) In the "A" directory: find . -type f > a.txt 2) In the "B" directory: cat a.txt | while read FILENAMES; do touch "$FILENAMES"; done 3) Result: step 2) "creates the files" [I mean only files with the same names, but with 0-byte size], OK. But if there are subdirectories in the "A" directory, then step 2) can't create the files inside them, because those directories don't exist in "B". Question: is there a way that "touch" can create directories?
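
    For reference, touch itself won't create missing directories, but a small variant of step 2) can recreate each path's directory part with mkdir -p before touching the file (a minimal sketch, run in the "B" directory):

      # recreate the directory part of each listed path, then the empty file
      while IFS= read -r f; do
          mkdir -p "$(dirname "$f")"
          touch "$f"
      done < a.txt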

    Read the article

  • using "touch" to create directories?

    - by user62367
    1) in the "A" directory: find . -type f a.txt 2) in the "B" directory: cat a.txt | while read FILENAMES; do touch "$FILENAMES"; done 3) Result: the 2) "creates the files" [i mean only with the same filename, but with 0 Byte size] ok. But if there are subdirs in the "A" directory, then the 2) can't create the files in the subdir, because there are no directories in it. Question: is there a way, that "touch" can create directories?

    Read the article

  • Modifying Windows Shortcut .lnk file

    - by user13267
    Asking here as it's locked over there; I hope it belongs in this forum. Is it possible to open a .lnk (Windows shortcut) file in a hex editor and change the absolute shortcut path into a relative one? Can we do this in Windows? The edit command in cmd can open link files, but they are difficult to read and edit that way. Is there any hex editor in Windows that can open a .lnk file and allow me to edit it? Do I need to take it to Linux, or does Linux recognize .lnk as a shortcut too? I want to do this not only for shortcuts that run exe files, but also for shortcuts pointing to folders. A batch file which executes explorer.exe with the target folder as a parameter can do this, actually, but I want to know if there is any way to edit the data in the .lnk file itself.

    Read the article

  • Upload large database SQL file

    - by Devy
    I have a database of more than 20 GB in size on my hard disk. What is the best way to upload it with the least possible load (and cost) on the server? - I'm on Windows 7. - I have FTP and SSH access to the server. I avoid using FTP because my connection cuts out a lot; I can't imagine re-uploading the whole file again after it fails at 99%. I found some tools that split the large .sql file into small .sql files, but they didn't mention how to join these files back into one. Another way is to archive the big .sql file into a multi-volume .rar (with the -v option), upload the parts through FTP, then unpack them. But unpacking will also cost, right? I know it will cost in any case, but any best practice would be strongly appreciated.
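
    For reference, a minimal sketch (file names and hosts are placeholders): compress the dump, then upload it over SSH with rsync so an interrupted transfer keeps its partial file instead of restarting from zero; and if you do split the file, the pieces are simply concatenated back together with cat:

      # compress, then upload with partial-transfer support over SSH
      gzip -9 dump.sql
      rsync --partial --progress -e ssh dump.sql.gz user@server:/tmp/
      # if you split instead, rejoin the pieces on the server with cat
      split -b 500M dump.sql dump.part_
      cat dump.part_* > dump.sql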

    Read the article

  • Improving Windows Authentication performance on IIS

    - by flalar
    We're struggling with performance issues on an ASP.NET MVC site that is using Windows Authentication. Response time is very slow on the first request to the site, when the user is being authenticated. Further, every time the Authorization header is sent from the browser, the response time increases by many seconds. The same issue occurs for both dynamic requests and static content like CSS and JS. Access to the application is restricted to users within a certain role, and we are now planning to allow access to static files for all authenticated users to see if that helps. The authentication method in use is NTLM. How should we go about pinpointing why authentication decreases performance so drastically?

    Read the article

  • Virtualhosts - best way of dealing with it?

    - by axqe56
    I'm competent at the basics of Apache, PHP and virtual hosting, but I have a question about virtual hosting. As far as I'm aware, the HOSTS file can only be in one location: C:/Windows/system32/drivers/etc (it varies in older installs, I believe). I don't think a separate HOSTS file can be kept elsewhere just for Apache virtual hosts, apart from the main HOSTS file used for blocking sites etc. I heard about PAC files on Uniform Server's website (http://wiki.uniformserver.com/index.php/Virtual_Hosting:_PAC), but they're browser-specific, aren't they? What's the best way to deal with virtual hosts, other than the HOSTS file? My server isn't currently open to the internet, but what would be the best way to resolve DNS for my virtual host domains if it were to become forward-facing (i.e. open to the internet)?
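
    For reference, a minimal sketch of the usual pattern on Apache 2.x (the site names and paths are assumptions): the HOSTS file only has to map each name to the machine, and name-based virtual hosts pick the site by ServerName; once the server faces the internet, real DNS A records replace the HOSTS entries:

      # hosts file (C:/Windows/system32/drivers/etc/hosts, or /etc/hosts)
      127.0.0.1   site1.local
      127.0.0.1   site2.local

      # Apache 2.2 configuration
      NameVirtualHost *:80
      <VirtualHost *:80>
          ServerName site1.local
          DocumentRoot "C:/UniServer/www/site1"
      </VirtualHost>
      <VirtualHost *:80>
          ServerName site2.local
          DocumentRoot "C:/UniServer/www/site2"
      </VirtualHost>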

    Read the article

  • Linux/Apache performance very slow even on local network

    - by klausch
    I have an Ubuntu server machine running Apache and MySQL. System and version info is as follows: Linux kernel 3.0.0-12, Apache/2.2.20, MySQL Ver 14.14 Distrib 5.1.58. I am running a few websites on this server, some HTML only, some PHP/MySQL. The problem is that response time is very slow, on the static as well as the dynamic sites. Sometimes it takes more than 10 seconds before a response is given; this makes the sites very slow and almost unusable. The problem occurs even when requesting from the local network. I have added the involved subdomains to my /etc/hosts file, and above all the problem is not solved by using IP numbers instead of URLs, so there is no DNS lookup issue. I have modified the log format to show the response times, and sometimes a file takes 12 seconds to be served; see the jquery~.js file in the example screenshot. I have no explanation for this extremely long response time, but it is not even the only issue here: some other files take a long time to be served too, but do not show a long response time in the log file. So probably different issues are involved here. I cannot find a solution so far, any suggestions??? Thanks in advance, Klaas (link to screenshot of the access log file)
    Some extra configuration info: apache2.conf (comments removed): LockFile ${APACHE_LOCK_DIR}/accept.lock PidFile ${APACHE_PID_FILE} Timeout 300 KeepAlive On MaxKeepAliveRequests 100 KeepAliveTimeout 5 <IfModule mpm_prefork_module> StartServers 5 MinSpareServers 5 MaxSpareServers 10 MaxClients 150 MaxRequestsPerChild 0 </IfModule> <IfModule mpm_worker_module> StartServers 2 MinSpareThreads 25 MaxSpareThreads 75 ThreadLimit 64 ThreadsPerChild 25 MaxClients 150 MaxRequestsPerChild 0 </IfModule> <IfModule mpm_event_module> StartServers 2 MinSpareThreads 25 MaxSpareThreads 75 ThreadLimit 64 ThreadsPerChild 25 MaxClients 150 MaxRequestsPerChild 0 </IfModule> User ${APACHE_RUN_USER} Group ${APACHE_RUN_GROUP} AccessFileName .htaccess <Files ~ "^\.ht"> Order allow,deny Deny from all Satisfy all </Files> DefaultType text/plain HostnameLookups Off ErrorLog ${APACHE_LOG_DIR}/error.log LogLevel warn Include mods-enabled/*.load Include mods-enabled/*.conf Include httpd.conf Include ports.conf LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined LogFormat "%h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\" %T/%D" combined LogFormat "%h %l %u %t \"%r\" %>s %O" common LogFormat "%{Referer}i -> %U" referer LogFormat "%{User-agent}i" agent Include conf.d/ Include sites-enabled/
    And the virtual host file for one of the slow sites; in fact it is pretty straightforward... <VirtualHost *:80> ServerAdmin [email protected] ServerSignature EMail ServerName toenjoy.drsklaus.nl DocumentRoot /var/www/toenjoy.drsklaus.nl <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory /var/www/toenjoy.drsklaus.nl/> Options Indexes FollowSymLinks MultiViews AllowOverride AuthConfig AuthType Basic AuthName "To Enjoy" AuthUserFile /etc/.htpasswd Require user petraaa Order allow,deny allow from all </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog /var/log/apache2/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog /var/log/apache2/access.log combined Alias /doc/ "/usr/share/doc/" <Directory "/usr/share/doc/"> Options Indexes MultiViews FollowSymLinks AllowOverride None Order deny,allow Deny from all Allow from 127.0.0.0/255.0.0.0 ::1/128 </Directory> </VirtualHost>
    And the output of free -m: klaas@ubuntu-server:/etc/apache2$ free -m total used free shared buffers cached Mem: 1997 1401 595 0 144 1017 -/+ buffers/cache: 238 1758 Swap: 2035 0 2035 and I have no indication that swapping occurs at the moments the site is slow. I have run top and it does not appear to be a CPU issue. I have the impression that the spawning of an Apache thread could maybe be the bottleneck, but that is just a suggestion. Maybe this gives some extra information! EDIT: The problem seemed to be gone for some time but is occurring again! And not only with Apache: connecting using SSH also takes a tremendous time, sometimes up to 15 seconds before the passphrase is asked for. scp also works very slowly. The behaviour is really unpredictable and makes the server very hard to use. Any ideas...?
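
    For reference, a hedged sketch of quick checks that separate disk, memory and network effects when both Apache and SSH are intermittently slow (the test file path is a placeholder, and iostat comes from the sysstat package):

      # watch disk latency and CPU wait while the slowness is happening
      iostat -x 5
      vmstat 5
      # rough disk write test
      dd if=/dev/zero of=/tmp/ddtest bs=1M count=1024 conv=fdatasync
      # rule out reverse-DNS delays for SSH logins ("UseDNS no" avoids lookups)
      grep -i usedns /etc/ssh/sshd_config
      # check for hardware/driver complaints
      dmesg | tail -50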

    Read the article

  • Steps to install solely ubuntu 13.04 on Dell inspiron 14z ultrabook with SSD+HDD

    - by rishy
    I have tried a few things, like disabling Intel Smart Response and choosing AHCI in the BIOS, but there are certain problems I am still facing. I can't see my SSD during the installation of Ubuntu (I am planning to install Ubuntu on the SSD and keep other files on the HDD). When I run Ubuntu, my laptop overheats and the battery life drops to 90 minutes (I guess it's related to my graphics driver, ATI Radeon HD 7570). The cooling fan seems to run at full speed; it was working much better in Windows. So, overall, I wanted to know what exact steps I need to follow to install Ubuntu on the SSD and then use the HDD to keep other files, and how I can get rid of the overheating and battery life problems.

    Read the article

  • JRE not running correctly on Windows 7 64Bit

    - by dkt91
    An application called DBGL (DOSBox Game Launcher) requires a JRE in order to run successfully. I had it installed and running on my old Windows 7 32-bit machine without any problems. Today I got my new PC with Windows 7 64-bit and I cannot get it running anymore. If I install the latest 32-bit JRE to the \Program Files (x86) folder, DBGL says it needs the latest JRE version in order to run and doesn't start. If I install the latest 64-bit JRE to the \Program Files folder, I no longer get an error message, but the application doesn't start either; clicking it has literally no effect. Right now I have both JRE versions installed in parallel and I get the same result as when only the 64-bit version is installed. Thanks in advance!

    Read the article

  • Changing Word mail merge data source locations in bulk?

    - by Daft Viking
    I've just moved a number of Word mail merge files, and a number of Excel spreadsheets that are the data sources for the mail merges, from a Windows XP computer to a Windows 7 computer, and now all the paths for the merge sources are incorrect (used to be c:\documents and settings\user\my documents.... now c:\users\documents....). While I can correct the path of the data source in each file individually, I was hoping that there would be some way of updating the files in bulk, as there are a relatively large number of them. Word 2007 is what is being used, but the documents are all in the previous DOC format (not DOCX).

    Read the article

  • Trying to grok Linux quotas, where is the data stored?

    - by CarpeNoctem
    So all the tutorials and documentation for the Linux quota system have left me confused. For each filesystem with quotas enabled/on, where is the actual quota information stored? Is it filesystem metadata or is it in a file? Say user foo creates a new file on /home. How does the kernel determine whether user foo is below their hard limit? Does the kernel have to tally up quota information on that filesystem each time, or is it in the superblock or somewhere else? As far as I understand, the kernel consults the aquota.user file for the actual rules, but where is the current quota usage data stored? Can this be viewed with any tools outside repquota and the like? TIA!! Update: Thanks for the help. I had already read that mini-HOWTO. I am pretty clear on the usage of the user-space tools. What I was unclear on is whether the usage data was ALSO in the file that stores the per-user limits, and you answered this with a yes. From what I can tell, rc.sysinit runs quotacheck and quotaon on startup. The quotacheck program analyzes the filesystem and updates the aquota.* files. It then makes use of quota.h and the quotactl() syscall to inform the kernel of the quota info. From this point forward the kernel hashes that information and increments/decrements quota stats as changes occur. Upon shutdown, the init.d/halt script runs the quotaoff command RIGHT before the filesystems are unmounted. The quotaoff command does not appear to update the aquota.* files with the information the kernel has in memory. I say this because the {a,c,m}times for the aquota.user file are only updated upon a reboot of the system or by manually running the quotacheck command. It appears - as far as I can tell - that the kernel just drops its up-to-date usage data on the floor at shutdown. This information is never used to update the aquota.* files; they are updated during startup by quotacheck (rc.sysinit). Seems silly to me, since that updated info has already been collected by the kernel. So... in conclusion, I am still not entirely clear on the methods. ;)
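
    For reference, a minimal sketch of the user-space commands that read and rebuild the usage data kept in the aquota.* files (the filesystem path and user name are placeholders):

      # per-user usage and limits for all quota-enabled filesystems
      repquota -avu
      # usage and limits for a single user
      quota -u foo
      # rescan a filesystem and rewrite aquota.user / aquota.group from actual usage
      quotacheck -vugm /home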

    Read the article

  • git private server error: "Permission denied (publickey)."

    - by goddfree
    I followed the instructions here in order to set up a private git server on my Amazon EC2 instance. However, I am having problems when trying to SSH into the git account. Specifically, I get the error "Permission denied (publickey)." Here are the permissions of my files/folders on the EC2 server: drwx------ 4 git git 4096 Aug 13 19:52 /home/git/ drwx------ 2 git git 4096 Aug 13 19:52 /home/git/.ssh -rw------- 1 git git 400 Aug 13 19:51 /home/git/.ssh/authorized_keys Here are the permissions of my files/folders on my own computer: drwx------ 5 CYT staff 170 Aug 13 14:51 .ssh -rw------- 1 CYT staff 1679 Aug 13 13:53 .ssh/id_rsa -rw-r--r-- 1 CYT staff 400 Aug 13 13:53 .ssh/id_rsa.pub -rw-r--r-- 1 CYT staff 1585 Aug 13 13:53 .ssh/known_hosts When checking my logs in /var/log/secure, I used to get the following error message every time I tried to SSH: Authentication refused: bad ownership or modes for file /home/git/.ssh/authorized_keys However, after making a few permission changes, I no longer get this error message. Despite this, I am still getting the "Permission denied (publickey)." message every time I try to SSH. The command I am using to SSH is ssh -T git@my-ip. Here is the full log I get when I run ssh -vT [email protected]: OpenSSH_6.2p2, OSSLShim 0.9.8r 8 Dec 2011 debug1: Reading configuration data /etc/ssh_config debug1: /etc/ssh_config line 20: Applying options for * debug1: Connecting to my-ip [my-ip] port 22. debug1: Connection established. debug1: identity file /Users/CYT/.ssh/id_rsa type -1 debug1: identity file /Users/CYT/.ssh/id_rsa-cert type -1 debug1: identity file /Users/CYT/.ssh/id_dsa type -1 debug1: identity file /Users/CYT/.ssh/id_dsa-cert type -1 debug1: Enabling compatibility mode for protocol 2.0 debug1: Local version string SSH-2.0-OpenSSH_6.2 debug1: Remote protocol version 2.0, remote software version OpenSSH_6.2 debug1: match: OpenSSH_6.2 pat OpenSSH* debug1: SSH2_MSG_KEXINIT sent debug1: SSH2_MSG_KEXINIT received debug1: kex: server->client aes128-ctr [email protected] none debug1: kex: client->server aes128-ctr [email protected] none debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP debug1: SSH2_MSG_KEX_DH_GEX_INIT sent debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY debug1: Server host key: RSA 08:ad:8a:bc:ab:4d:5f:73:24:b2:78:69:46:1a:a5:5a debug1: Host 'my-ip' is known and matches the RSA host key. debug1: Found key in /Users/CYT/.ssh/known_hosts:1 debug1: ssh_rsa_verify: signature correct debug1: SSH2_MSG_NEWKEYS sent debug1: expecting SSH2_MSG_NEWKEYS debug1: SSH2_MSG_NEWKEYS received debug1: Roaming not allowed by server debug1: SSH2_MSG_SERVICE_REQUEST sent debug1: SSH2_MSG_SERVICE_ACCEPT received debug1: Authentications that can continue: publickey debug1: Next authentication method: publickey debug1: Trying private key: /Users/CYT/.ssh/id_rsa debug1: Trying private key: /Users/CYT/.ssh/id_dsa debug1: No more authentication methods to try. Permission denied (publickey). I have spent a few hours going through threads on various sites, including SO and SF, looking for a solution. It seems that the permissions for my files are all okay, but I just can't figure out the problem. Any help would be greatly appreciated. Edit: EEAA: Here are the outputs you requested: $ getent passwd git git:x:503:504::/home/git:/bin/bash $ grep ssh ~git/.ssh/authorized_keys | wc -l grep: /home/git/.ssh/authorized_keys: Permission denied 0
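
    For reference, a hedged sketch of the usual server-side checks for "Permission denied (publickey)" when the key itself looks right (it assumes the /home/git paths above, and the restorecon step only applies on SELinux-enabled systems such as Amazon Linux):

      # ownership and modes sshd will accept
      sudo chown -R git:git /home/git/.ssh
      sudo chmod 700 /home/git /home/git/.ssh
      sudo chmod 600 /home/git/.ssh/authorized_keys
      # compare fingerprints of the client key and the key(s) the server holds
      ssh-keygen -lf ~/.ssh/id_rsa.pub                     # on the client
      sudo ssh-keygen -lf /home/git/.ssh/authorized_keys   # on the server
      # restore SELinux contexts, then watch sshd's complaint while retrying
      sudo restorecon -R -v /home/git/.ssh
      sudo tail -f /var/log/secure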

    Read the article

  • Backup solution

    - by user66115
    We are currently looking for a new backup solution. Our current network is 5 remote locations, with a tape backup in each plant. Right now we are looking at an MPLS VPN and running backups out of our main plant. The main things that we back up are user private folders and department files, and each plant has its own file server that houses CAD drawings. My main plan is to have everything but the CAD drawings at the main facility. We would start with a full backup of the drawing files and then do incremental backups back to the main plant. Besides tapes, what would be the best way to back up? Our contact at PC Connection is pointing us toward a Tandberg Data device.

    Read the article

  • Virtualbox, merging snapshots and base disk

    - by Henrik
    Hi, I have a virtual machine with about 30 snapshots in branches. The current development path is 22 snapshots plus the base disk. The number of files now seems to be having an impact on IO on the dev laptop I'm using (I don't know if it is host disk performance issues with the 140 GB total size spread over a lot of fragments, or just the fact that it is hitting sectors distributed across a lot of files). I would like to merge the current development branch of snapshots together with the base disk, but I am unsure whether the following command would produce the correct outcome. I am not able to boot this disk after the procedure completes (5-6 hours). vboxmanage clonehd "C:\VPC-Storage\.VirtualBox\Machines\CRM\Snapshots\{245b27ac-e658-470a-b978-8e62137c33b1}.vhd" "E:\crm-20100624.vhd" --format VHD --type normal Could anyone confirm whether this is the correct approach or not?
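
    For reference, a hedged sketch rather than a confirmed answer: cloning the topmost differencing image of the current branch is generally expected to produce a single standalone disk containing the merged state, and showhdinfo can confirm which image the current state points at before spending the hours on the clone. The clone also gets a new UUID, so it has to be attached to the VM (or a fresh VM) in place of the old chain before it will boot:

      # list all registered disks and inspect the snapshot image in question
      VBoxManage list hdds
      VBoxManage showhdinfo "C:\VPC-Storage\.VirtualBox\Machines\CRM\Snapshots\{245b27ac-e658-470a-b978-8e62137c33b1}.vhd"
      # clone that differencing image into one flattened VHD, then attach the
      # clone to the VM before trying to boot it
      VBoxManage clonehd "C:\VPC-Storage\.VirtualBox\Machines\CRM\Snapshots\{245b27ac-e658-470a-b978-8e62137c33b1}.vhd" "E:\crm-20100624.vhd" --format VHD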

    Read the article

  • MySQL gzipped Export in PhpMyAdmin has wrong size in Mozilla

    - by Michal Gow
    That is really strange. I am using phpMyAdmin 2.11.9.6 on Linux hosting. When I export databases using "gzipped" compression in Mozilla Firefox, I get files which have the size of the uncompressed database, but they seem to download at an incredible speed (10 times quicker than is possible with my ISP). So in the end, for a database of 10 MB I get a 10 MB "gzip" downloaded almost instantly; it indeed shows a 10 MB size on disk, and it is corrupted. Zip compression works just fine (I get a file of about 1 MB with the compressed database content intact). And the weirdest thing: this happens with Mozilla Firefox (13.0.1) only; Internet Explorer 9 downloads correct gzipped files... Any hint?
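
    For reference, a hedged sketch of how to check whether the export is being served with a Content-Encoding: gzip header, which would make the browser transparently decompress the "gzipped" file while saving it under the .gz name (the URL and file name below are placeholders):

      # does the saved file still contain gzip data, or already-plain SQL?
      file dump.sql.gz
      # compare the response headers with and without Accept-Encoding
      curl -sI 'http://example.com/phpmyadmin/export.php'
      curl -sI -H 'Accept-Encoding: gzip' 'http://example.com/phpmyadmin/export.php'
      # a "Content-Encoding: gzip" header on the download is decompressed on the
      # fly by the browser, leaving plain SQL inside a file named *.gz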

    Read the article
