Search Results

Search found 68873 results on 2755 pages for 'flat file'.


  • Is there any way to prevent a Mac from creating dot underscore files?

    - by SoaperGEM
    At work we're letting one of our very tech-savvy clients help out with a few development projects specific to him. However, he uses his own personal MacBook, and as he edits files on our (Windows) network, his MacBook always creates a bunch of unnecessary metadata files that we end up deleting later. For instance, it creates a file called .DS_Store in any directory he opens, as well as "dot underscore" files for each file he edits: if he's editing a file called "Main.php", his MacBook will create another file called "._Main.php". I know there are ways to prevent the creation of .DS_Store files, but I haven't found anything about preventing these hidden files prefixed with dot underscore. Is there any way to turn that off on Macs, so the files are never created in the first place?
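
    Two partial measures that may help (a sketch, not a guaranteed fix): Apple documents a preference that stops .DS_Store creation on network volumes, and the bundled dot_clean utility merges or removes the AppleDouble "._*" files after the fact. As far as I know there is no supported switch that prevents ._ files on foreign file systems entirely. The share path below is a placeholder.

        # On the MacBook: stop writing .DS_Store to network volumes
        # (takes effect after the user logs out and back in)
        defaults write com.apple.desktopservices DSDontWriteNetworkStores true

        # Periodically merge/remove existing "._*" files on the mounted share
        # (/Volumes/projects is hypothetical)
        dot_clean -m /Volumes/projects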


  • nginx rewrite rule to convert URL segments to query string parameters

    - by Nick
    I'm setting up an nginx server for the first time and having some trouble getting the rewrite rules right. The Apache rules we used were:

        # Serve real files and directories directly; send all requests for / to Director.php
        DirectoryIndex Director.php

        # If the URL has one segment, pass it as rt
        RewriteRule ^/([a-zA-Z0-9\-\_]+)/$ /Director.php?rt=$1 [L,QSA]

        # If the URL has two segments, pass them as rt and action
        RewriteRule ^/([a-zA-Z0-9\-\_]+)/([a-zA-Z0-9\-\_]+)/$ /Director.php?rt=$1&action=$2 [L,QSA]

    My nginx config file looks like:

        server {
            ...
            location / {
                try_files $uri $uri/ /index.php;
            }
            location ~ \.php$ {
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }
        }

    How do I get the URL segments into query string parameters like in the Apache rules above?

    UPDATE 1: Trying Pothi's approach:

        # serve static files directly
        location ~* ^.+\.(jpg|jpeg|gif|css|png|js|ico|html)$ {
            access_log off;
            expires 30d;
        }
        location / {
            try_files $uri $uri/ /Director.php;
            rewrite "^/([a-zA-Z0-9\-\_]+)/$" "/Director.php?rt=$1" last;
            rewrite "^/([a-zA-Z0-9\-\_]+)/([a-zA-Z0-9\-\_]+)/$" "/Director.php?rt=$1&action=$2" last;
        }
        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php5-fpm.sock;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }

    This produces the output "No input file specified." on every request. I'm not clear on whether the .php location gets triggered (and the request subsequently passed to PHP) when a rewrite in another block points at a .php file.

    UPDATE 2: I'm still confused about how to set up these location blocks and pass the parameters.

        location /([a-zA-Z0-9\-\_]+)/ {
            fastcgi_pass unix:/var/run/php5-fpm.sock;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME ${document_root}Director.php?rt=$1{$args};
            include fastcgi_params;
        }

    UPDATE 3: It looks like the root directive was missing, which caused the "No input file specified." message. Now that this is fixed, I get the index file as if the URL were / on every request, regardless of the number of URL segments. It appears that my location regular expression is being ignored. My current config is:

        # This location is ignored:
        location /([a-zA-Z0-9\-\_]+)/ {
            fastcgi_pass unix:/var/run/php5-fpm.sock;
            fastcgi_index Director.php;
            set $args $query_string&rt=$1;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
        location / {
            try_files $uri $uri/ /Director.php;
        }
        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php5-fpm.sock;
            fastcgi_index Director.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
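
    A sketch of one common way to express this in nginx (untested against this exact setup): fall through to a named location, rewrite there with last so the request re-enters location matching and lands in the PHP block with the rewritten query string (nginx appends the original args automatically, like QSA). Note also that prefix locations cannot contain regexes, which would explain why the location /([a-zA-Z0-9\-\_]+)/ block never matches; regex locations need the ~ modifier.

        server {
            root /var/www/example;   # placeholder: must point at the directory containing Director.php

            location / {
                # Serve real files/directories first, then hand everything else to @director
                try_files $uri $uri/ @director;
            }

            location @director {
                # One segment -> rt; two segments -> rt and action.
                # "last" restarts location matching, so the .php block below picks these up.
                rewrite "^/([a-zA-Z0-9\-\_]+)/$" /Director.php?rt=$1 last;
                rewrite "^/([a-zA-Z0-9\-\_]+)/([a-zA-Z0-9\-\_]+)/$" /Director.php?rt=$1&action=$2 last;
                rewrite ^ /Director.php last;
            }

            location ~ \.php$ {
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index Director.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }
        }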


  • Get Matrox Millenium video card working in Ubuntu 9.10

    - by wcoenen
    I have installed Ubuntu 9.10 on an old PC and it is mostly working, except for some heavy drawing defects that show up whenever I start dragging a window or scrolling inside a window or menu. It looks like the video driver copies the rectangle being moved to the wrong location. I have taken a look in /var/log/Xorg.0.log and the following lines show the detected video card:

        (--) PCI:*(0:0:8:0) 102b:0519:0000:0000 Matrox Graphics, Inc. MGA 2064W [Millennium] rev 1, Mem @ 0xf9800000/16384, 0xfb000000/8388608, BIOS @0x????????/65536
        (==) Using default built-in configuration (30 lines)
        (==) --- Start of built-in configuration ---
        Section "Device"
            Identifier "Builtin Default mga Device 0"
            Driver     "mga"
        EndSection

    How do I fix the drawing defects?

    It turned out that the 24-bit color depth (automatically selected by Ubuntu 9.10) was the problem; apparently the mga driver doesn't handle it well on cards with little memory. I took the following steps to resolve the issue (you can skip the first three if you already have a semi-working xorg.conf file):

    1. Reboot Ubuntu in recovery mode to get a root console without X running.
    2. Run Xorg -configure to generate an xorg.conf.new file.
    3. Copy the file into place with cp xorg.conf.new /etc/X11/xorg.conf (assuming it didn't exist yet; that's why I generated it).
    4. Open the new config file with sudo nano /etc/X11/xorg.conf and make sure the screen section is configured for 16-bit color depth, like this:

        Section "Screen"
            Identifier   "Screen0"
            Device       "Card0"
            Monitor      "Monitor0"
            DefaultDepth 16
            SubSection "Display"
                Viewport 0 0
                Depth    16
                Modes    "1024x768"
            EndSubSection
        EndSection

    I can't guarantee those were the only important changes I made (I tried a few things in my attempts to create a valid xorg.conf file), but I'm pretty sure the screen section was the important part.


  • My server's been hacked EMERGENCY

    - by Grant unwin
    I'm on my way into work at 9:30 p.m. on a Sunday because our server has been compromised somehow and was resulting in a DOS attack on our provider. The server's access to the Internet has been shut down, which means 500-600 of our clients' sites are now down. Now this could be an FTP hack, or some weakness in code somewhere; I'm not sure till I get there. How can I track this down quickly? We're in for a whole lot of litigation if I don't get the server back up ASAP. Any help is appreciated.

    UPDATE: Thanks to everyone for your help. Luckily I WASN'T the only person responsible for this server, just the nearest. We managed to resolve this problem, although our solution may not apply to others in a different situation. I'll detail what we did.

    We unplugged the server from the net. It was attempting to perform a denial-of-service attack on another server in Indonesia, and the guilty party was also based there. We first tried to identify where on the server this was coming from; considering we have over 500 sites on the server, we expected to be searching for some time. However, with SSH access still available, we ran a command to find all files edited or created in the window when the attacks started. Luckily, the offending file was created over the winter holidays, which meant that not many other files were created on the server at that time. We were then able to identify the offending file, which was inside the uploaded-images folder of a ZenCart website.

    After a short cigarette break we concluded that, given the file's location, it must have been uploaded via a file-upload facility that was inadequately secured. After some googling, we found a security vulnerability in the ZenCart admin panel that allowed files to be uploaded as the picture for a record company (a section that was never really even used). Posting this form simply uploaded any file: it did not check the extension, and didn't even check whether the user was logged in. This meant any file could be uploaded, including the PHP file used for the attack. We secured the vulnerability with ZenCart on the infected site and removed the offending files. The job was done, and I was home by 2 a.m.

    The moral:
    - Always apply security patches for ZenCart, or any other CMS for that matter. When security updates are released, the whole world is made aware of the vulnerability.
    - Always do backups, and back up your backups.
    - Employ or arrange for someone who will be there in times like these, to prevent anyone from relying on a panicky post on Server Fault.

    Happy servering!
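
    For the "find everything created or modified in the attack window" step, a minimal sketch with GNU find (the dates and web root are placeholders for the real window and path):

        # Files changed between two dates (GNU find with -newermt)
        find /var/www -type f -newermt '2011-12-20' ! -newermt '2012-01-02' -print

        # Portable variant for a find without -newermt: compare against marker files
        touch -d '2011-12-20' /tmp/win_start && touch -d '2012-01-02' /tmp/win_end
        find /var/www -type f -newer /tmp/win_start ! -newer /tmp/win_end -print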


  • Prevent Chrome from automatically opening downloaded PDF and Image files

    - by Phoenix
    When I download a PDF or image in Google Chrome on my Mac, is it possible to prevent Chrome from automatically opening it in my default application for that file type (e.g., Preview)? I notice that Chrome does not do this for other downloaded files such as audio and ZIP archives. I still want to be able to preview files in Chrome; I just want to prevent it from automatically launching my image/PDF viewer application after I download them. For example:

    1. I click on a link in an email to a PDF document or an image file.
    2. Chrome displays the contents in the browser.
    3. I press Cmd-S and save the file to my computer.
    4. When the download finishes, the file opens automatically in Preview.app.

    It's that last step that I would like to bypass.


  • Clean logging with BASH

    - by Matt Krouse
    I have a script that deletes files 7 days or older and then logs them to a folder. It logs and deletes everything correctly, but when I open the log file for viewing, it's very sloppy:

        log=$HOME/Deleted/$(date)
        find $HOME/OldLogFiles/ -type f -mtime +7 -delete -print > "$log"

    The log file is difficult to read. Example file output (when opened in Notepad):

        /home/u0146121/OldLogFiles/file1.txt/home/u0146121/OldLogFiles/file2.txt/home/u0146121/OldLogFiles/file3.txt

    Is there any way to log the files nicer and cleaner, maybe with the filename, the date deleted, and how old it was? Any suggestions help!
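
    A sketch of a friendlier log line using GNU find's -printf, which can emit the name and last-modified date before -delete removes the file (part of the Notepad symptom is also just LF-only line endings, which Notepad on older Windows renders as one long line):

        # One dated log file per run; %p is the path, %TY-%Tm-%Td the mtime date
        log="$HOME/Deleted/$(date +%Y-%m-%d).log"
        find "$HOME/OldLogFiles/" -type f -mtime +7 \
            -printf "deleted %p (last modified %TY-%Tm-%Td)\n" -delete >> "$log"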


  • Zip files way larger on a Mac using Finder than with the 'zip' command (2x larger)

    - by user33947
    I have a directory of JPEGs. Each one is roughly 90k, as reported by Photoshop when saving and also by the 'ls' command. When I get the properties for a file with Finder, it's double that, over 220k, and zipping with Finder packages this extra bulk as well. Running zip -v test.zip ./dir makes a MUCH smaller zip file. Zipping the files on Windows also results in a much smaller file, roughly the same size as the Unix zip command produces, and file sizes are reported correctly there. I can't find any mention of this anywhere, so I'm asking here.
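
    A guess worth verifying rather than a confirmed answer: the doubled size Finder reports is usually resource forks and other extended attributes, which Finder's archiver preserves (as AppleDouble entries) while command-line zip ignores them. These commands show whether the JPEGs carry them:

        # macOS: the '@' flag lists extended attribute keys and sizes in long output
        ls -l@ dir/*.jpg

        # List attribute names on one file (e.g. com.apple.ResourceFork)
        xattr -l dir/file1.jpg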


  • Notepad++'s pesky EOL Format switching -- how to remove the invisible (default?) keyboard shortcuts Ctrl+M, Ctrl+J

    - by AKE
    Notepad++ lets the user specify whether the end-of-line (EOL) format for a file should be entirely Windows, Unix, or Mac. Notepad++ also remembers the last EOL format encountered in a file and uses it when a new file is created. But Notepad++ seems to have some pesky built-in keyboard shortcuts that can create MIXED-format files, wreaking havoc with this otherwise quite reasonable behavior. Specifically:

    - Ctrl+M inserts a Mac-style EOL (0x0D only)
    - Ctrl+J inserts a Unix-style EOL (0x0A only)

    The hazard is that rapid typing, combined with heavy use of other shortcut commands, could inadvertently produce one of these, each time turning at least one line of the file into another EOL format. So my question: how can I turn OFF these apparently built-in keyboard shortcuts? Please note: I've already scanned through Settings > Shortcut Mapper and could not find Ctrl+M or Ctrl+J listed for EOL conversion. Thanks,


  • Organize code in Chef: libraries, classes and resources

    - by ColOfAbRiX
    I am new to both Chef and Ruby and I am implementing some scripts to learn them. Now I am facing the problem of how to organize my code: I have created a class in the libraries directory and I have used a custom namespace to maintain order. This is a simplified example of my file:

        # ~/chef-repo/cookbooks/mytest/libraries/MyTools.rb
        module Chef::Recipe::EP
          class MyTools
            def self.print_something( text )
              puts "This is my text: #{text}"
            end
            def self.copy_file( dir, file )
              cookbook_file "#{dir}/#{file}" do
                source "#{dir}/#{file}"
              end
            end
          end
        end

    From my recipe I call both methods:

        # ~/chef-repo/cookbooks/mytest/recipes/default.rb
        EP::MyTools.print_something "Hello World!"
        EP::MyTools.copy_file "/etc", "passwd"

    print_something works fine, but with copy_file I get this error:

        undefined method `cookbook_file' for Chef::Recipe::EP::FileTools:Class

    It's clear that either I don't know how to create libraries in Chef or I'm missing some basic assumption. Can anyone help me, please? I am looking for a solution to this problem (organizing my code into libraries and using resources from classes) or, better, good Chef documentation, as I find the official documentation unclear and disorganized enough that searching through it is a pain.
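
    One pattern that may fix the copy_file error (a sketch, not verified against every Chef version): resource DSL methods such as cookbook_file live on the recipe object, not on arbitrary classes, so pass the recipe in and declare the resource on it. The file layout mirrors the question; the source value assumes a matching file exists under the cookbook's files/ directory.

        # ~/chef-repo/cookbooks/mytest/libraries/my_tools.rb (hypothetical name)
        module EP
          class MyTools
            # 'recipe' is the calling recipe context; resources declared on it
            # are added to that recipe's resource collection as usual.
            def self.copy_file(recipe, dir, file)
              recipe.cookbook_file "#{dir}/#{file}" do
                source file    # assumes files/default/<file> exists in the cookbook
              end
            end
          end
        end

    And from the recipe, pass self as the context:

        # ~/chef-repo/cookbooks/mytest/recipes/default.rb
        EP::MyTools.copy_file(self, "/etc", "passwd")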


  • Percona system tables corrupted.

    - by Anand Jeyahar
    I am having problems setting up MySQL replication with Percona as the server. I accidentally took a full dump from MySQL, restored it on Percona, and then started replication. Now when I stop and start the slave, I get the error:

        [ERROR] Failed to open the relay log './s5-bin.000003' (relay_log_pos 2029993)
        110103  9:15:59 [ERROR] Could not find target log during relay log initialization

    But SHOW LOCAL VARIABLES shows the relay_log variable as set in the cnf file; the relay-log variable is set to slave-relay-bin alright. I am able to start MySQL as a service, but mysqld_safe fails with the error:

        110103  9:19:39 [ERROR] /usr/sbin/mysqld: Can't create/write to file '/var/run/mysqld/mysqld.pid' (Errcode: 2)
        110103  9:19:39 [ERROR] Can't start server: can't create PID file: No such file or directory

    I am now lost as to what the problem is.
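
    A common way out of orphaned-relay-log errors like this one (a sketch; double-check the coordinates before running it on anything important) is to record the slave's executed position, throw the relay logs away, and re-point the slave so it re-fetches from the master's binlog:

        -- On the slave: note Relay_Master_Log_File and Exec_Master_Log_Pos
        SHOW SLAVE STATUS\G

        STOP SLAVE;
        -- Deletes all relay log files and forgets the relay position,
        -- which is why the coordinates are re-supplied below
        RESET SLAVE;
        CHANGE MASTER TO
          MASTER_LOG_FILE = 'mysql-bin.000042',  -- placeholder: value noted above
          MASTER_LOG_POS  = 12345;               -- placeholder: value noted above
        START SLAVE;

    The mysqld_safe failure looks unrelated: Errcode 2 means "no such file or directory", so the /var/run/mysqld directory is probably missing (mkdir -p /var/run/mysqld and chown it to the mysql user).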


  • Would the shell command 'join' cause an out-of-memory error?

    - by Hancy
    I have two files to join.

    FILE 1:

        a A1
        a A2
        a A3
        ...
        c C1
        c C2
        ...

    FILE 2:

        a feature1_of_a
        a feature2_of_a
        ...
        a featureN_of_a
        ...
        c feature1_of_c
        c feature2_of_c
        ...

    After the join, I should get a file like this:

        A1 feature1_of_a
        A2 feature1_of_a
        A3 feature1_of_a
        A1 feature2_of_a
        A2 feature2_of_a
        A3 feature2_of_a
        ...
        A1 featureN_of_a
        A2 featureN_of_a
        A3 featureN_of_a
        ...

    To do that, I wrote the shell command join -11 -21 -o1.2,2.2 file1 file2. But the problem is that the number N might be huge, so if join reads all features of a into memory at once, memory might not be enough. I don't know how join is implemented. Would memory become a problem? If so, is there any way to get what I want?
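
    For reference, a sketch of the usual invocation: join requires both inputs sorted on the join field, and sort spills to temporary files on disk, so neither step needs the whole input in RAM. As far as I can tell, join streams its inputs and only buffers the lines sharing the current key, so memory grows with the largest single-key group rather than with the whole file.

        # Sort both files on the key column (join insists on sorted input)
        sort -k1,1 file1 > file1.sorted
        sort -k1,1 file2 > file2.sorted

        # Emit column 2 of each file for every pair of lines with a matching key
        join -1 1 -2 1 -o 1.2,2.2 file1.sorted file2.sorted > joined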


  • Multiple .bkf files created in Backup Exec 12.5 or 2010: related to heavy I/O?

    - by syuusuke
    Hey everyone, I was wondering if anyone who has used Backup Exec 12.5 or 2010 has ever seen multiple .bkf files created for a single job. To describe what I mean: the .bkf files are being created with random sizes under 2GB, even though I've set the option to split the file at 10GB. Some jobs will create 20 .bkf files, with chunks ranging from 50MB to 800MB. Is this a sign of heavy I/O issues? Bandwidth limitations? I'm not sure; I'm here to seek advice and suggestions. I've set up another backup server with the same exact settings, and it creates a new .bkf file only when the 10GB limit has been reached. Although I am backing up different machines, I know my settings are an exact match to the problematic server's, or at least I think it's a problem.


  • Powershell Set-Acl fails

    - by Ulrich
    While working on a little backup script, I try to change the ACL of a file using Set-Acl in PowerShell 1 on Vista and always get the following error message:

        Set-Acl : The security identifier is not allowed to be the owner of this object.

    This error persists even if I reduce the script to a minimum:

        $acl = Get-Acl $sourcepath$file
        $acl | Format-List
        Set-Acl -Path $sourcepath$file -AclObject $acl

    Does anyone know the reason for this error? Obviously I'm not changing the ownership of the file... By the way: what I ultimately want to achieve is to reduce all access rights to ReadAndExecute. Is there maybe an easier way of doing this in PowerShell? Thanks for your help! Ulrich
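
    For the ReadAndExecute goal, a sketch that applies the ACL through .NET instead of Set-Acl: a freshly built FileSecurity object only carries the DACL changes, so the (possibly problematic) owner field is never written back. The account name is a placeholder.

        $path = Join-Path $sourcepath $file

        # Fresh descriptor: stop inheriting, keep nothing from the parent
        $fsec = New-Object System.Security.AccessControl.FileSecurity
        $fsec.SetAccessRuleProtection($true, $false)

        # Grant only ReadAndExecute to one principal (placeholder name)
        $rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
            'MYDOMAIN\SomeUser', 'ReadAndExecute', 'Allow')
        $fsec.AddAccessRule($rule)

        # Write just the modified (DACL) section back to the file
        [System.IO.File]::SetAccessControl($path, $fsec)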


  • How to cleanup tmp folder safely on Linux

    - by Syncopated
    I use RAM for my tmpfs /tmp: 2GB, to be exact. Normally this is enough, but sometimes processes create files in there and fail to clean up after themselves, which can happen if they crash. I need to delete these orphaned tmp files, or else future processes will run out of space on /tmp. How can I safely garbage-collect /tmp? Some people do it by checking the last-modification timestamp, but this approach is unsafe because there can be long-running processes that still need those files. A safer approach is to combine the last-modification-timestamp condition with the condition that no process has an open file handle for the file. Is there a program or script that embodies this approach, or some other approach that is also safe? Incidentally, does Linux/Unix allow a mode of file creation wherein the created file is deleted when the creating process terminates, even if it crashes?
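
    One way to express the "timestamp plus no open handles" rule is a sketch like this, assuming GNU find and lsof are available (there is still a small race between the lsof check and the rm, so treat it as a best-effort collector):

        # Delete regular files in /tmp untouched for 2+ days that no process has open
        find /tmp -xdev -type f -mtime +2 -print0 |
        while IFS= read -r -d '' f; do
            # lsof exits 0 when some process holds the file open; skip those
            if ! lsof "$f" > /dev/null 2>&1; then
                rm -f -- "$f"
            fi
        done

    As for the last question: the classic Unix idiom is to create the file and immediately unlink it while keeping the descriptor open; the kernel reclaims the space when the last descriptor closes, even after a crash. Newer Linux kernels also offer O_TMPFILE for exactly this.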


  • Computer boots but has no display; suspected connection problem

    - by whamsicore
    My computer is booting but has no display. The optical drive has power but the USB ports don't, so I figured I had a broken motherboard. However, when I tried to boot the computer again lying flat, it worked fine for two seconds and then froze, just as it did the first time. It has had no display since. I have come to the conclusion that it is a connection problem (someone had a similar problem on one of the online forums).


  • Fixed ruby/mysql connection with new libmysql.dll, and broke Apache in the process

    - by jmtoporek
    OK, so a bit of background: all my development has been on a local Windows 7 machine. I had Apache with PHP/MySQL running with no issues. I've been using Ruby (1.9.3 and the latest Rails release, 3.2.9) with the built-in WEBrick server, but had a devil of a time connecting to MySQL. I did some research, updated my libmysql.dll file in C:/ruby/bin, and it worked! Very happy... except now Apache stopped working. In my attempt to resolve the issue, I found an older copy of libmysql.dll, renamed the new file, copied the old file back to C:/ruby/bin, and now Apache works but Ruby does not. So I can take this ass-backwards approach, but obviously it seems pretty stupid. I was surprised that Apache was using the DLL in the ruby/bin folder; I presume this is related to path variables? I'm hoping someone can direct me as to how I can use one DLL file for Apache and another for Ruby, or suggest some other smarter approach. I'm smart enough to follow directions to install Apache from scratch and enable PHP on Windows as well as Ubuntu, but I'm not much of a sysadmin, just a semi-competent web developer.


  • Playing audiobooks in Windows. [closed]

    - by Phenom
    Possible duplicate: What media player application can remember where you last paused/stopped the track?

    My audiobooks are in MP3 format. Each chapter is a file. Sometimes I stop playing in the middle of a file, and when I click on the file again I want it to continue where I left off. That's what my iPod touch does. Are there any Windows programs that can do that?


  • Automate opening HTML and printing to PDF

    - by craigpatik
    I need a way to automate the following process in Windows 7:

    1. Open an .html file in Internet Explorer
    2. Print to PDF
    3. Save the PDF with a patterned file name (i.e., original_name_YYYY-MM-DD.pdf)

    Ideally, I could drag and drop several files or open a whole folder of files at once, and a PDF would be created for each one. A command-line solution is also acceptable. The files have to be opened in the browser because parts of the page are rendered with JavaScript on page load; in other words, if you simply right-click on the file in Explorer and choose "print", the resulting file is not the same because the JS didn't run. If it helps, Internet Explorer can be set as the default browser, and a PDF printer can be set as the default printer.
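
    Not IE-based, but a possibility worth testing: wkhtmltopdf renders pages with a WebKit engine (JavaScript included) and takes plain command-line arguments, so a small PowerShell loop can cover a whole folder. The sketch assumes wkhtmltopdf is installed and on PATH.

        # Convert every .html file in the current folder to name_YYYY-MM-DD.pdf
        $stamp = Get-Date -Format 'yyyy-MM-dd'
        Get-ChildItem -Filter *.html | ForEach-Object {
            $out = '{0}_{1}.pdf' -f $_.BaseName, $stamp
            # --javascript-delay gives on-load scripts time to run before rendering
            & wkhtmltopdf --javascript-delay 2000 $_.FullName $out
        }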


  • Looking for a way to diff many server's filesystem

    - by Itai Ganot
    There's a push in my office to make sure that the file systems on all servers in the production environment (a Windows environment) are identical down to the last file, and I'm looking for a program or tool that can help me achieve this goal. What I actually need is a tool that will let me diff servers' file systems by connecting to them remotely (as they are spread around the world). Does anyone know of a tool that allows this?
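
    If no dedicated tool turns up, a rough roll-your-own sketch with PowerShell remoting (assumes PowerShell 4+ for Get-FileHash, remoting enabled on the servers, and placeholder names and paths):

        $servers = 'web01', 'web02'        # placeholders
        $root    = 'C:\inetpub\wwwroot'    # placeholder tree to compare

        # Hash every file on every server in one pass
        $manifest = Invoke-Command -ComputerName $servers -ScriptBlock {
            param($root)
            Get-ChildItem $root -Recurse -File |
                Get-FileHash -Algorithm SHA256 |
                Select-Object Path, Hash
        } -ArgumentList $root

        # Paths that are missing somewhere or whose contents differ
        $manifest | Group-Object Path |
            Where-Object { $_.Count -ne $servers.Count -or
                           ($_.Group.Hash | Sort-Object -Unique).Count -ne 1 } |
            Select-Object Name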


  • How do I mount a sparse disk image permanently?

    - by Mike
    On Mac OS X 10.6.7, when I mount a sparse disk image (either by double-clicking it or using hdid from the command line), the image appears on my desktop and needs to be re-mounted every time I log in. I'd like to set up the equivalent of an /etc/fstab entry that mounts the image when the system boots and makes it permanent, so I don't have to worry about whether my symbolic links will resolve. Is this more trouble than it's worth on a Mac? I noticed that there is no /etc/fstab, and /etc/fstab.hd contains a dire warning:

        IGNORE THIS FILE. This file does nothing, contains no useful data, and might go away in future releases. Do not depend on this file or its contents.

    I tried sudo hdid -notremovable <image>, which seemed like half of what I wanted (according to man hdid), but it failed with the error:

        hdid: attach failed - no mountable file systems.
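
    The Mac-native replacement for an fstab entry is a launchd job; a sketch saved as /Library/LaunchDaemons/com.example.mountimage.plist (label, filename, and image path are all placeholders, and hdiutil attach is the current spelling of hdid):

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
          "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
        <plist version="1.0">
        <dict>
            <key>Label</key>
            <string>com.example.mountimage</string>
            <key>ProgramArguments</key>
            <array>
                <string>/usr/bin/hdiutil</string>
                <string>attach</string>
                <string>/Users/mike/images/data.sparseimage</string>
            </array>
            <key>RunAtLoad</key>
            <true/>
        </dict>
        </plist>

    Loading it once with sudo launchctl load -w /Library/LaunchDaemons/com.example.mountimage.plist should make the image mount at every boot.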


  • What is a "good" tool to password-protect .pdf files?

    - by Marius Hofert
    What is a "good" tool to encrypt (password-protect) .pdf files, without being required to buy additional software? The protection can be created under Linux, but the password prompt should work on Windows too. I know that zip can do it:

        zip zipfile_name_without_ending -e files_to_encrypt.foo

    What I don't like about this is that for a single file, you have to use WinZip to open the zip and then click the file again. I would rather be prompted for a password when opening the .pdf itself (the single-file case). I know that pdftk can do this:

        pdftk foo.pdf output foo_protected.pdf user_pw mypassword

    The problem here is that the password is displayed in the terminal, even if you use ... user_pw PROMPT. But in the end you get a password-protected .pdf, and you are prompted for the password when opening the file.
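
    A sketch of an alternative that never shows the password, assuming qpdf is installed (read -s suppresses echo; note the password is still briefly visible in the process list while qpdf runs):

        # Prompt silently, then encrypt with the same user and owner password
        read -r -s -p "PDF password: " pw; echo
        qpdf --encrypt "$pw" "$pw" 128 -- foo.pdf foo_protected.pdf

    Opening foo_protected.pdf in any reader, Windows included, then prompts for the password.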


  • Cygwin creates files with special (shared) icons on Windows

    - by barjonah
    I use Cygwin to transfer files between Linux and Windows machines. Every time I transfer a file to a Windows machine, it adds an extra shared-user overlay to the file's or folder's icon. This also happens when I create a file on Windows from Cygwin using pretty much any command: echo, vim, nano, cat. [The question included two screenshots contrasting a Cygwin-created folder's icon with a normal folder's.] I'm thinking it has to do with permissions, because I have to chmod it every time if I want other applications to access the files or folders on Windows. How can I tell Cygwin to create regular ol' files, just like a user or any other program would?
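
    The fix I've seen suggested most often (a sketch; adjust the options to taste) is to mount the Windows drives with noacl, which turns off Cygwin's POSIX-permission emulation so new files simply inherit the parent folder's Windows ACL:

        # /etc/fstab inside the Cygwin installation:
        # remount /cygdrive paths without POSIX ACL emulation
        none /cygdrive cygdrive binary,noacl,posix=0,user 0 0

    A plain chmod afterwards should no longer be needed for other Windows applications to read the files.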


  • Is it possible to continue torrent downloads from another client?

    - by Nrew
    I downloaded a torrent file using BitLord, but BitLord doesn't have a feature like initial seeding, so I decided to switch to uTorrent. I've already tried to continue the download there. What I did was:

    1. Opened the original torrent file in uTorrent.
    2. Set the save location to where BitLord's incomplete file was.

    But it seems that uTorrent isn't detecting the file; it makes another copy and starts downloading from the beginning. I'm at about 80 percent, there are only 18 seeds against 17 leechers, and my connection is slow, so I really need to initial-seed. Other details: I've reformatted the original machine where I downloaded it, and I also didn't copy the original application.

