Search Results

Search found 39200 results on 1568 pages for 'zip files'.


  • unlock database files when SQL server is idle

    - by Andy
    In my development/test environment on my laptop, I can't back up the SQL Server database files because the file handles are kept permanently open by SQL Server (VSS doesn't work because the drive is encrypted with TrueCrypt). I was hoping there might be some setting in SQL Server that makes it unlock the data files after a certain period of inactivity and automatically open them again on demand, but I can't find anything. I don't really want to be dumping the database out every night because it's only a development environment. Apart from stopping SQL Server before I do the backup, is there any other solution?
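
    One workaround, if stopping the whole service feels too heavy (a sketch, not from the linked article; MyDevDb is a placeholder database name): take just that database offline for the duration of the file copy.

        -- take the database offline so SQL Server releases the file handles
        ALTER DATABASE [MyDevDb] SET OFFLINE WITH ROLLBACK IMMEDIATE;

        -- ... copy the .mdf/.ldf files with the backup tool here ...

        -- reattach the files and resume normal operation
        ALTER DATABASE [MyDevDb] SET ONLINE;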

    Read the article

  • Managing trace files on Sql Server 2005

    - by Sophtware
    I need to manage the trace files for a database on SQL Server 2005 Express Edition. The C2 audit logging is turned on for the database, and the files that it's creating are eating up a lot of space. Can this be done from within SQL Server, or do I need to write a service to monitor these files and take the appropriate actions? I found the [master].[sys].[trace] table with the trace file properties. Does anyone know the meaning of the fields in this table?
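
    For illustration, a rough T-SQL sketch of how the trace catalog (the sys.traces view, which is what the question refers to) can be inspected and a trace stopped; the trace id 1 is only an example, and a C2 audit trace restarts automatically for as long as the C2 option stays enabled:

        -- list active traces and where their files are written
        SELECT id, path, max_size, max_files, is_rollover
        FROM sys.traces;

        -- stop a trace (status 0), then close and delete its definition (status 2)
        EXEC sp_trace_setstatus @traceid = 1, @status = 0;
        EXEC sp_trace_setstatus @traceid = 1, @status = 2;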

    Read the article

  • Import data from multiple CSV files to an Excel sheet

    - by Chetan
    I need to import data from 50 similar CSV files into a single Excel sheet. Is there any way to take only selected columns from each file and put them together in one sheet? Structure of my CSV files: a few columns are exactly the same in all the files (I want these in Excel), then one column with the same header but different data in each file, which I want to place side by side in the Excel sheet under names taken from the CSV file names. I do not want any of the other remaining columns. In short: read all the CSV files, take the columns common to all of them and put them in the Excel sheet; then take the one column from each file that has the same header but different data and put those next to each other, named after the CSV file names; leave the rest of the columns out; and write the sheet to an Excel file. Initially I thought it could be done easily, but given that my programming skills are still at the learning stage it is way too difficult for me. Please help.
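
    For illustration, a minimal Python sketch of that merge (not from the article; it assumes the openpyxl package, and the column names id/name/date/value are made-up placeholders for the common and varying columns):

        import csv
        import glob
        import os
        from openpyxl import Workbook

        COMMON_COLS = ["id", "name", "date"]   # hypothetical columns shared by all files
        VARYING_COL = "value"                  # hypothetical column whose data differs per file

        rows = {}          # id -> dict of common column values
        per_file = {}      # file label -> {id: varying value}

        for path in sorted(glob.glob("data/*.csv")):
            label = os.path.splitext(os.path.basename(path))[0]
            per_file[label] = {}
            with open(path, newline="") as f:
                for rec in csv.DictReader(f):
                    key = rec["id"]
                    rows.setdefault(key, {c: rec[c] for c in COMMON_COLS})
                    per_file[label][key] = rec[VARYING_COL]

        wb = Workbook()
        ws = wb.active
        labels = sorted(per_file)
        ws.append(COMMON_COLS + labels)   # header row: common columns, then one column per file
        for key, common in rows.items():
            ws.append([common[c] for c in COMMON_COLS] +
                      [per_file[lbl].get(key, "") for lbl in labels])
        wb.save("combined.xlsx")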

    Read the article

  • Uploading files to a server that has Real Time Antivirus scan running

    - by zecougar
    I need to allow users to upload files onto a server that has an antivirus program running with real-time scanning switched on. What would be a good design to ensure that infected files are not uploaded to the server? Questions: would large files be copied onto disk and then immediately scanned, or would they be scanned as they are copied and not allowed to appear on disk if infected? Should I build a separate piece of infrastructure around this to specifically invoke a scan on the copied file? This might be an issue if the file is deleted by the real-time scan.
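
    One common design (a sketch, assuming a command-line scanner such as ClamAV's clamscan is available; the paths are placeholders): upload into a quarantine directory that the application never serves, scan each file explicitly, and only then move it into the public store, rather than relying on the real-time scanner alone.

        #!/bin/sh
        # scan a freshly uploaded file and release it only if it is clean
        QUARANTINE=/srv/uploads/quarantine
        CLEAN=/srv/uploads/clean

        f="$QUARANTINE/$1"
        if clamscan --no-summary "$f"; then   # exit code 0 = clean, 1 = infected
            mv "$f" "$CLEAN/"
        else
            rm -f "$f"                        # or move it to a review area and alert an admin
        fi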

    Read the article

  • .htaccess - deny downloading of files

    - by user317005
    I keep several fonts in the directory "/fonts/" on my server, which I then load into my CSS files via @font-face. However, I want to make sure that people cannot download a font just by going to http://www.domain.com/fonts/fontname.ttf. Can I somehow prevent this and still be able to load the font files from my CSS files? I think putting deny from all into the .htaccess file would also prevent the CSS files from loading the fonts correctly. I hope this makes sense.
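
    A font the browser can fetch for the CSS can never be made completely undownloadable, but a Referer check raises the bar a little. A sketch using Apache 2.2-style directives, with domain.com standing in for the real domain; note that the Referer header can be spoofed or stripped, so treat this as a deterrent only:

        SetEnvIfNoCase Referer "^https?://(www\.)?domain\.com/" local_ref=1

        <FilesMatch "\.(ttf|otf|eot|woff)$">
            Order Deny,Allow
            Deny from all
            Allow from env=local_ref
        </FilesMatch>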

    Read the article

  • Edit write-protected files by breaking hard links

    - by Taymon
    A directory which I own and can write to contains hard links to files that I don't own and don't have write permission for. I want to open and edit these files in Emacs. When I save my changes, Emacs should rename the existing hard link by appending ~, then write my new version of the file as a new file owned by me. I was under the impression that Emacs could just do this (because of the way it does backups), but it's not working; when I save, it attempts to change the file's permissions in order to write to it (and fails because I don't own the file). How do I make this happen?
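
    A minimal sketch of the Emacs settings that are usually involved (variable names and defaults differ between Emacs versions, so treat this as a starting point rather than a confirmed fix):

        ;; when the visited file has multiple hard links, write a fresh file on save,
        ;; breaking the link instead of modifying the original in place
        (setq break-hardlink-on-save t)

        ;; make backups by renaming the old file (foo -> foo~) rather than copying it
        (setq backup-by-copying nil
              backup-by-copying-when-linked nil)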

    Read the article

  • About using assembly with c

    - by kristus
    Hi. I've sort of just finished a mandatory task at school, and I'm about to deliver it. But then I came across something that was unfamiliar: header files. :( What I've got:

        test-program.c
        task_header.h
        function1.s
        function2.s
        function3.s
        function4.s

    test-program.c:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include "task_header.h"
        .
        ..
        ...

    task_header.h:

        extern void function1(...);
        extern void function2(...);
        extern int function3(...);
        extern void function4(...);

    And then I use the command:

        gcc -m32 -o runtest test-program.c function1.s function2.s function3.s function4.s

    Is this a proper way to do it, or is it possible to modify it so I can just type:

        gcc -m32 -o runtest test-program.c
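
    That command line is perfectly proper; linking the .s files is required, so the shorter gcc invocation alone won't work. The usual way to avoid retyping the long command is a small Makefile (a sketch using the file names from the question), after which a plain make rebuilds runtest whenever a source file changes:

        # Makefile -- note that each recipe line must be indented with a TAB
        SRCS = test-program.c function1.s function2.s function3.s function4.s

        runtest: $(SRCS) task_header.h
        	gcc -m32 -o runtest $(SRCS)

        clean:
        	rm -f runtest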

    Read the article

  • Unable to move or delete files

    - by Erik
    Hi: Just today I got the following error while trying to move/delete several files: "The action can't be completed because the file is open in another program." The file wasn't open, but just in case, I closed all programs. When that failed to let me move or delete the file, I restarted the computer. When that failed too, I came here. Any suggestions? The files can be copied and pasted, but move/delete fails even after multiple restarts.

    Read the article

  • Download web server structure with empty files

    - by golimar
    I want to make a mirror of a web server, but downloading the actual files will take too long. So I thought of having just the directory and file structure, and when I need the actual contents of a file, I can download just that file. I have tried wget --spider URL, and in a short time it created the directory structure on my local disk, with no files. But I've checked all of wget's and curl's switches and there is nothing like what I need. Can this be done with wget, curl or any other tool?
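
    A rough shell sketch of one way to fake it with wget alone (example.com is a placeholder, and the -nv log format varies a little between wget versions, so the grep pattern may need tweaking): crawl in spider mode, then recreate every URL the crawl saw as an empty placeholder file.

        # crawl without downloading file bodies, logging every URL that is visited
        wget --spider -r -nv -o spider.log http://example.com/

        # turn each logged URL into an empty placeholder file under ./mirror
        grep -oE 'http://example\.com/[^ ]+' spider.log | sort -u |
        while read -r url; do
            rel="${url#http://example.com/}"
            case "$rel" in ''|*/) continue ;; esac   # skip bare directory URLs
            mkdir -p "mirror/$(dirname "$rel")"
            touch "mirror/$rel"
        done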

    Read the article

  • Can I Store MediaWiki Files on the cloud?

    - by user219048
    I recently got a Chromebook, and I've been brainstorming different ways to put MediaWiki on it (with localhost, not a server). One way I've read about online is to go into developer mode to download and set up LAMP. I was wondering, wouldn't I be able to store the Apache, MySQL, PHP, and MediaWiki files in the cloud (Google Drive)? And if so, would anything prevent me from accessing my wiki on any other computer's localhost, assuming I could just log into Google Drive to access these files? Might there be any reduced performance when operating from the cloud?

    Read the article

  • minifying patched javascript files

    - by Stacia
    I'm writing a Rails app and I've partially integrated this nice little patch to the inline AJAX editor: http://inplacericheditor.box.re/ The problem is, on that page I have TinyMCE, Prototype and script.aculo.us included. In Firefox at least there's a big lag when all this stuff is loading. I was hoping to fix it by compressing the files, so I checked out a plugin for Rails called Smurf. It seemed to do what it was supposed to do nicely, but it choked on the little patch files that are included with the AJAX editor. The patch files look like this:

        Object.extend(Ajax.InPlaceEditor.prototype, {
          handleAJAXFailure: function(transport)

    Alternatively, should I just be caching them instead of worrying about minifying them? I know I'm running on development and that Apache would maybe handle serving the JS files differently... It just seems like a lot of things to serve on one page.

    Read the article

  • How to have CVS files in different directory than source files in NetBeans?

    - by Ondrej Slinták
    I have a project in NetBeans which hasn't used CVS until now. Let's say the directory with source files is called /www/source_files and the directory with project files /www/project_files. The module in the repository is named differently from the source files directory. When I try to check out from CVS, it forces me to create a directory named exactly like the module, which in fact is fine by me. Straight after that it asks me if I want to create a new project. And here the problem begins. I don't want to do that, and I have no idea how to link the newly created directory with CVS and the checked-out files with my project. I'd like to end up with the following structure:

        /www
          /source_files
          /project_files
          /cvs_files

    Any ideas how to do this? I'm using NetBeans 6.8.

    Read the article

  • Editing remotely the PHP files on a Centos server

    - by Alex2012
    I have an intranet web server (CentOS 6, Apache, PHP) to which I would like to give access to a developer. He will connect by remote desktop from Windows 7 to Ubuntu 12.04, and from there by SSH to the /var/www/html folder where he has to create and edit the files. This solution was chosen because: I could not make a remote desktop connection from Windows to CentOS; the web developer needs an editor for PHP files and is not allowed to install software on the Windows 7 machine; and it is more of a test solution (we are all learning to use Linux). When the developer is connected from Ubuntu to CentOS by SSH (SFTP), he can save changes only if the account used to connect has ownership of that folder on CentOS. Can you please tell me how I can grant the required rights? I tried different solutions found on the Internet, but without much success. Is there another way to connect to the CentOS server?
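
    One common way to grant this (a sketch; the group name webdev and the account name devuser are made up, and the commands are run as root on the CentOS box) is to put the developer's account in a group that owns the web root and make the tree group-writable, with the setgid bit so new files inherit the group:

        groupadd webdev
        usermod -aG webdev devuser               # the account used for the SSH/SFTP login
        chown -R apache:webdev /var/www/html     # Apache keeps access as the owning user
        chmod -R g+rw /var/www/html
        find /var/www/html -type d -exec chmod g+s {} +   # new files inherit the webdev group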

    Read the article

  • Apache2 WebServer not allowing me to view website/files in /var/www

    - by CitadelCSAlum
    I used to be able to access websites/files that were stored in the directory /var/www. I have not used this for a while, but now I need to store media in this directory or in the directory /var/www/images. I noticed that my Apache web server wasn't running correctly, so I did a complete package removal and then reinstalled, but I am still unable to access a test page index.html in the directory /var/www by going to http://myipaddresshere/index.html. Is there some initial configuration I need to do to allow me to store HTML and media files in this directory and be able to access them from the browser? I don't remember having to do anything before.
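
    A few things worth checking from a shell (a sketch assuming the Debian/Ubuntu-style Apache layout implied by the apache2 package; adjust paths for other distributions):

        sudo apache2ctl configtest                    # syntax-check the configuration
        sudo apache2ctl -S                            # show which vhosts/DocumentRoot are active
        grep -R "DocumentRoot" /etc/apache2/sites-enabled/
        ls -l /var/www/index.html                     # the file must be readable by www-data
        sudo chmod -R a+rX /var/www
        sudo service apache2 restart
        tail -f /var/log/apache2/error.log            # watch for errors while requesting the page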

    Read the article

  • Excel macro to change location of .cub files used by pivot tables? (to allow .xls files that depend

    - by Rory
    I often use Excel with pivot tables based on .cub files for OLAP-type analysis. This is great except when you want to move the .xls and you realise that internally it has a non-relative reference to the location of the .cub file. How can we cope with this, i.e. make it convenient to move around .xls files that depend on .cub files? The best answer I could come up with is writing a macro that updates the pivot tables' reference to the .cub file location... so I'll post that in an answer.
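
    The macro referred to above isn't shown in this excerpt, but a rough VBA sketch of the idea looks like this (the old and new paths are placeholders, and the exact connection-string format depends on the OLAP provider, so inspect one cache's Connection string first):

        Sub RepointCubFiles()
            ' replace the old .cub path with the new one in every pivot cache connection
            Const OLD_PATH As String = "C:\old\analysis.cub"   ' placeholder
            Const NEW_PATH As String = "D:\new\analysis.cub"   ' placeholder
            Dim pc As PivotCache

            For Each pc In ThisWorkbook.PivotCaches
                If InStr(pc.Connection, OLD_PATH) > 0 Then     ' note: case-sensitive match
                    pc.Connection = Replace(pc.Connection, OLD_PATH, NEW_PATH)
                    pc.Refresh
                End If
            Next pc
        End Sub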

    Read the article

  • JAVA: multiple files download at the same time?

    - by user319096
    Hi guys, is there any method for downloading multiple files at the same time? That is, after selecting multiple files, clicking the download button and choosing the destination directory, the selected files will all be downloaded at once. I googled it and didn't find any solutions; does anybody know? I'm using Struts 1 and Spring 2.
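
    Over plain HTTP a browser receives one file per response, so one common approach in a Struts/Spring app is to bundle the selected files into a single zip on the fly. A minimal sketch of the zipping part in plain Java (the Struts action/servlet wiring and the file paths are left as placeholders):

        import java.io.IOException;
        import java.io.OutputStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.Arrays;
        import java.util.List;
        import java.util.zip.ZipEntry;
        import java.util.zip.ZipOutputStream;

        public class ZipDownload {

            // Streams the selected files as a single zip archive to the given output,
            // e.g. the servlet response output stream (the caller sets the
            // "application/zip" content type and a Content-Disposition header).
            public static void writeZip(List<Path> files, OutputStream out) throws IOException {
                try (ZipOutputStream zip = new ZipOutputStream(out)) {
                    for (Path file : files) {
                        zip.putNextEntry(new ZipEntry(file.getFileName().toString()));
                        Files.copy(file, zip);   // copy the file's bytes into the current entry
                        zip.closeEntry();
                    }
                }
            }

            public static void main(String[] args) throws IOException {
                // placeholder usage: bundle two local files into download.zip
                List<Path> selected = Arrays.asList(Paths.get("a.pdf"), Paths.get("b.pdf"));
                try (OutputStream out = Files.newOutputStream(Paths.get("download.zip"))) {
                    writeZip(selected, out);
                }
            }
        }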

    Read the article

  • SFTP sending files between laptops on Ubuntu

    - by twigg
    I want to transfer files between two Ubuntu systems using SFTP. I have got it set up and I can connect to the other laptop, ping it and see its file list using sftp> dir. I can see the files on the other system. But when I call get filename.deb it says Fetching /home/user/filename.deb to filename.deb 0% 0 0.0KB/s --:-- ETA and then drops back to the sftp command prompt without transferring anything. Have I missed something?

    Read the article

  • List files recursively and sort by modification time

    - by Problemaniac
    How do I list all files under a directory recursively and sort the output by modification time? I normally use ls -lhtc but it doesn't find all files recursively. I am using Linux and Mac. ls -l on Mac OS X can give

        -rw-r--r--  1 fsr  user  1928 Mar  1  2011 foo.c
        -rwx------  1 fsr  user  3509 Feb 25 14:34 bar.c

    where the date part isn't consistent or aligned, so a solution has to take this into account. Partial solution:

        stat -f "%m%t%Sm %N" ./* | sort -rn | head -3 | cut -f2-

    works, but not recursively.
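
    A sketch that adds the recursion (GNU find on Linux can print the timestamp itself; on macOS the stat trick above can be fed by find):

        # GNU/Linux: epoch time, a human-readable date, then the path; newest first
        find . -type f -printf '%T@\t%TY-%Tm-%Td %TH:%TM\t%p\n' | sort -rn | cut -f2-

        # macOS / BSD: reuse stat, but let find supply the files recursively
        find . -type f -exec stat -f '%m%t%Sm %N' {} + | sort -rn | cut -f2- | head -20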

    Read the article

  • Mass modify all php files on my server

    - by anslume
    I would like to delete a piece of PHP code from all the PHP files on my Debian server. Specifically, I would like to get rid of this line:

        eval(base64_decode("DQplcnJvcl9yZXBvcnR"));

    It's present in many of my PHP files. That's why I would like to find a script which will look it up in all my PHP files and replace it with nothing. Do you have any idea how I could do that? I know how to do it on Windows with some software (Notepad++ is very useful), but I have no idea how to do that from the command line through SSH. Thanks for your answer, Ans
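
    A sketch of how this is usually done over SSH with grep and sed (back up the files first; /var/www stands in for wherever the PHP files live, and the base64 string is abbreviated exactly as in the question, which is enough for a prefix match):

        # see which files contain the injected line
        grep -rl 'eval(base64_decode("DQplcnJvcl9yZXBvcnR' --include='*.php' /var/www

        # delete every line containing that call, editing the files in place
        find /var/www -name '*.php' \
            -exec sed -i '/eval(base64_decode("DQplcnJvcl9yZXBvcnR/d' {} +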

    Read the article
