Search Results

Search found 49518 results on 1981 pages for 'configuration files'.

  • LED Display control software

    - by user978733
    My university bought a big new LED display from a Chinese manufacturer. What I want to do is show some music visualizations on it (like the ones Windows Media Player, Winamp, iTunes, etc. produce). I would simply drag the application window onto the screen, but the main problem is that the software that controls the display (called "Led Vision") doesn't support showing application windows: it only shows a limited set of file types, such as PowerPoint presentations, video files, picture files, and so on. So the question is: where can I find visualization video files? Something created in After Effects that looks like this:

  • Robocopy hiding folders on backup drives

    - by Neil Barnwell
    I have a backup batch file that uses Robocopy to back up my files:

        robocopy "C:\" "G:\Default\RoboCopyBackup\C" /XF Pagefile.sys /XD "System Volume Information" "Recycler" "Temporary Internet Files" "Installer Cache" "Temp" /E /R:1 /W:0 /TEE /XJ

    This should create a folder structure on the external backup drive like so: G:\Default\RoboCopyBackup\C\... However, G: appears totally empty. What is weird is that the folders and files are there: if I type the above path into the address bar, I see all the files and folders. Can anyone help me work out why? I think it might be some NTFS-based ownership/permissions thing, but I'm not sure.
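
    One thing that might be worth checking in a situation like this is whether the destination folder itself has picked up the Hidden and System attributes, since Explorer hides such folders even though typing the path directly still works. Purely as an illustrative sketch (assuming Python 3.5+ on Windows; the path is the one from the question):

        import os
        import stat

        def describe_attributes(path):
            """Print whether a folder carries the HIDDEN and SYSTEM attributes (Windows only)."""
            attrs = os.stat(path).st_file_attributes
            print(path)
            print("  hidden:", bool(attrs & stat.FILE_ATTRIBUTE_HIDDEN))
            print("  system:", bool(attrs & stat.FILE_ATTRIBUTE_SYSTEM))

        if __name__ == "__main__":
            describe_attributes(r"G:\Default\RoboCopyBackup\C")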

  • How to convert tar file from gnu format to pax format

    - by nosid
    On the one hand I have a lot of tar files created in GNU format, and on the other hand I have a tool that only supports the pax (aka POSIX) format. I am looking for an easy way to convert the existing tar files to pax format - without extracting them to the file system and re-creating the archives. GNU tar supports both formats; however, I haven't found an easy way to do the conversion. How can I convert the existing GNU tar files to pax?
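
    One option that avoids touching the file system is to re-pack each archive with Python's standard tarfile module, which can read GNU archives and write pax (PAX_FORMAT). The following is a minimal sketch of that idea; the file names are placeholders and error handling is omitted.

        import tarfile

        def gnu_to_pax(src_path, dst_path):
            """Re-pack a GNU-format tar archive into POSIX pax format.

            Members are streamed from the source archive straight into the
            new one, so nothing is extracted to the file system.
            """
            with tarfile.open(src_path, "r") as src, \
                 tarfile.open(dst_path, "w", format=tarfile.PAX_FORMAT) as dst:
                for member in src:
                    if member.isfile():
                        dst.addfile(member, src.extractfile(member))
                    else:
                        dst.addfile(member)  # directories, links, devices carry no data

        if __name__ == "__main__":
            gnu_to_pax("archive-gnu.tar", "archive-pax.tar")  # placeholder names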

  • Nginx + Wordpress Multisite 3.4.2 + subdirectories + static pages and permalinks

    - by UrkoM
    I am trying to set up WordPress Multisite, using subdirectories, with Nginx, php5-fpm, APC, and Batcache. Like many other people, I am getting stuck on the rewrite rules for permalinks. I have followed these two guides, which seem to be as official as you can get:

        http://evansolomon.me/notes/faster-wordpress-multisite-nginx-batcache/
        http://codex.wordpress.org/Nginx#WordPress_Multisite_Subdirectory_rules

    It is partially working: http://blog.ssis.edu.vn works, and http://blog.ssis.edu.vn/umasse/ works. But other permalinks, like these two to a post and to a static page, don't work:

        http://blog.ssis.edu.vn/umasse/2008/12/12/hello-world-2/
        http://blog.ssis.edu.vn/umasse/sample-page/

    They either take you to a 404 error or to some other blog! Here is my configuration:

        server {
            listen 80 default_server;
            server_name blog.ssis.edu.vn;
            root /var/www;
            access_log /var/log/nginx/blog-access.log;
            error_log /var/log/nginx/blog-error.log;

            location / {
                index index.php;
                try_files $uri $uri/ /index.php?$args;
            }

            # Add trailing slash to */wp-admin requests.
            rewrite /wp-admin$ $scheme://$host$uri/ permanent;

            # Add trailing slash to */username requests
            rewrite ^/[_0-9a-zA-Z-]+$ $scheme://$host$uri/ permanent;

            # Directives to send expires headers and turn off 404 error logging.
            location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
                expires 24h;
                log_not_found off;
            }

            # This prevents hidden files (beginning with a period) from being served.
            location ~ /\. {
                access_log off;
                log_not_found off;
                deny all;
            }

            # Pass uploaded files to wp-includes/ms-files.php.
            rewrite /files/$ /index.php last;
            if ($uri !~ wp-content/plugins) {
                rewrite /files/(.+)$ /wp-includes/ms-files.php?file=$1 last;
            }

            # Rewrite multisite '.../wp-.*' and '.../*.php'.
            if (!-e $request_filename) {
                rewrite ^/[_0-9a-zA-Z-]+(/wp-.*) $1 last;
                rewrite ^/[_0-9a-zA-Z-]+.*(/wp-admin/.*\.php)$ $1 last;
                rewrite ^/[_0-9a-zA-Z-]+(/.*\.php)$ $1 last;
            }

            location ~ \.php$ {
                # Forbid PHP on upload dirs
                if ($uri ~ "uploads") {
                    return 403;
                }
                client_max_body_size 25M;
                try_files $uri =404;
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include /etc/nginx/fastcgi_params;
            }
        }

    Any ideas are welcome! Have I done something wrong? I have disabled Batcache to see if it makes any difference, but still no go.

  • Find and Replace String in filenames

    - by shekhar
    I have thousands of files with no specific extensions. What I need to do is search for a string in the filename and replace it with another string, then search for a second string and replace it with another, and so on. In other words, I have multiple strings to replace with multiple other strings. For example:

        "abc" in the filename replaced with "def"  (the string "abc" may appear in many filenames)
        "jkl" in the filename replaced with "srt"  (the string "jkl" may appear in many filenames)
        "pqr" in the filename replaced with "xyz"  (the string "pqr" may appear in many filenames)

    I am currently using an Excel macro to pull the file names into Excel, preserving the original names in one column and building the desired names in another column, and then creating a batch file of the form:

        rename Path\OriginalName1 NewName1
        rename Path\OriginalName2 NewName2

    The problem with this procedure is that it takes a lot of time because there are so many files, and as I am using Excel 2003 there is a limit on the number of rows as well. I need a script that works something like:

        replacestr abc with def
        replacestr pqr with xyz

    in a single directory. Would it be better to do this in a Unix script?
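
    For illustration only, here is a minimal sketch of the batch-rename idea in Python (chosen because it runs unchanged on both Windows and Unix). The substring pairs are the placeholders from the question, and there is no collision checking or dry-run mode.

        import os

        # Placeholder mapping of substrings to replacements; edit as needed.
        REPLACEMENTS = {
            "abc": "def",
            "jkl": "srt",
            "pqr": "xyz",
        }

        def rename_in_directory(directory):
            """Rename every entry whose name contains one of the mapped substrings."""
            for name in os.listdir(directory):
                new_name = name
                for old, new in REPLACEMENTS.items():
                    new_name = new_name.replace(old, new)
                if new_name != name:
                    os.rename(os.path.join(directory, name),
                              os.path.join(directory, new_name))

        if __name__ == "__main__":
            rename_in_directory(".")  # run against the current directory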

  • Wrong java -version being reported

    - by Malachi
    I am running Windows 7 Professional x64 and have the following Java versions installed:

        x64 (C:\Program Files\Java): jdk1.6.0_24, jdk1.7.0_04, jdk1.7.0_07, jre6, jre7
        x86 (C:\Program Files (x86)\Java): jre1.6.0_07, jre6, jre7

    In my environment variables I have PATH containing C:\Program Files\Java\jdk1.6.0_24\bin and JAVA_HOME set to C:\Program Files\Java\jdk1.6.0_24\bin. However, running java -version reports:

        java version "1.7.0_07"
        Java(TM) SE Runtime Environment (build 1.7.0_07-b10)
        Java HotSpot(TM) 64-Bit Server VM (build 23.3-b01, mixed mode)

    How is this the case when there is no reference to this version of Java in my environment variables? Any help on this issue would be great, as I am trying to run Apache Ant using Java 1.6.
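
    As a diagnostic aid (not part of the original question), the short Python sketch below lists every java.exe found along the PATH in lookup order, which makes it easy to see which copy wins when java is typed at a prompt. Note that the command interpreter also checks the current directory first, so this only covers the PATH portion of the lookup.

        import os

        def find_on_path(exe_name="java.exe"):
            """List every copy of exe_name along PATH, in lookup order."""
            hits = []
            for entry in os.environ.get("PATH", "").split(os.pathsep):
                candidate = os.path.join(entry, exe_name)
                if os.path.isfile(candidate):
                    hits.append(candidate)
            return hits

        if __name__ == "__main__":
            for i, path in enumerate(find_on_path(), start=1):
                print(i, path)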

  • FTP Access Denied when uploading to server

    - by Albert
    Ok, here's the story. I have a server running FTP "out there". I can connect to it using the admin account, browse files, and download files. When I try to upload files, I get 550 Access Denied. I have tried through FileZilla and the command line. I have Windows Firewall turned off (on my machine). I can upload files from another machine (using the same admin account) on our local network (that is, the same public IP). What is the problem? I am running Windows 7, Build 7100, and the other machine on the network is running XP SP3. The thing that gets me, though, is that this worked for roughly the last four months without a problem; I got back into the office after the weekend today and it won't work...

  • Loads of memory in "standby" on Windows Server 2008 R2

    - by Jaap
    In our SharePoint farm, our Web Front End servers all have loads of memory in "standby" mode, meaning very little is available for our IIS worker process. We have 32 GB of RAM in each of the boxes, and standby memory will creep up to about 28 GB, whereas the IIS worker process only seems to be using about 2 GB. Also, we've seen the machine use the swap file extensively while this memory was in standby, so I am starting to think that this memory in standby mode is stopping IIS from using it, forcing it to swap to disk, causing more performance problems. I used Sysinternals RamMap to identify what is being kept in memory, and it was able to tell me that almost everything in standby memory is of type "Mapped File". When I sort the files listed under the file summary tab in RamMap by file size, the largest files (around a few hundred meg each) are IIS log files and SharePoint log files. I would like to understand which process is loading these files into standby memory and why they are not being released. When I do an iisreset, it does not release the memory. Any ideas? Thanks!

  • Mavericks permission issues with Windows Server deduplicated shares

    - by dmohlmaster
    We have a number of 10.9-10.9.3 (Mavericks) machines installed throughout our facility. Much of the user content is pulled from shares stored on our Windows Server 2012 file servers with deduplication enabled. I have found that files which are newly written or unoptimized can be accessed without issue - read, written, modified, etc. Once a file gets optimized/deduplicated and Windows adds the P and L attributes - sparse and symlink - the Macs running Mavericks begin to have access issues. Once the files get deduplicated, users begin receiving read access errors when copying files (see Error 1 below). This happens when copying to folders within the current folder tree or copying to the local system. If you stop the copy operation and retry a few more times, it may eventually work for the specific instance but fail again later. I am, however, able to copy these files without issue via the terminal. Other systems running 10.7 do not experience the same issues and are able to access file server resources without issue. Many of the systems having issues are newer and thus cannot be downgraded to 10.8 or 10.7. I have tried Finder replacements such as Pathfinder, but the results are the same. I know this is at least similar to the issues many Mac users are already experiencing and posting about, but I haven't seen it directly linked to deduplication and the attributes written by Windows Server. Has anyone seen this issue? Have any solutions been found?

    Error 1, when copying files after the P and L attributes have been set by deduplication:

        "One or more items can't be copied to "Folder" because you don't have permissions to read them."

    Via system.log, I am also seeing the following error when accessing these deduplicated file shares (the reparse point tag listed below is IO_REPARSE_TAG_DEDUP):

        smbfs_nget: filename.ext - unknown reparse point tag 0x80000013

  • Mercurial Scenario

    - by richzilla
    Hi all, I have a scenario in mercurial, and i cant finad anything that would tell me how to solve it. Basically, i have a mercurial repository with numerous branches for stable, development, experimental features etc.... However, ive found a bug in a set of core application files that are common to each branch. Is there a way to modify these files, and then push the changes to the common files to all the other branches, without sending any other changes? any help would be appreciated.

  • How can I improve performance over SMB/CIFS for an application that has poor write speeds?

    - by Jeremy
    I have a third party application that reads several large files and generates a third large file. Its performance is quite good when the generated file is stored on "local storage", i.e. either a direct attached or iSCSI-based disk. The source files that are read can be stored remotely on our NAS and accessed via SMB with little effect on performance. However, if we attempt to write the target file to any kind of SMB/CIFS share (Samba or Windows Server) the performance drops almost ten-fold. This is unacceptably slow in our case. Writing files to network shares is not otherwise slow. I can copy large files to SMB shares and get great performance - near what I would expect is possible given the disks and network in question. I have a theory that this application's problem with SMB shares has something to do with a lack of write caching over the share and perhaps lots of network roundtrips. Is this possible and is there anything that can be done about it?

  • Windows Server 2008 R2 DFS Root Namespace Required?

    - by caleban
    I would prefer to set up our DFS such as:

        \\domain.local\users
        \\domain.local\customers
        \\domain.local\support
        etc.

    Is this a problem? Do I need to instead set all of the above folders as targets under a root such as:

        \\domain.local\files\users
        \\domain.local\files\customers
        \\domain.local\files\support

    Other than the path being shorter in the top example, which is what I would prefer, is there a difference in functionality in Windows DFS between the two examples shown? Thanks in advance.

  • There is not enough space on the disk when there is?

    - by Lee Tickett
    Permissions are fine (inherited), and checking effective permissions everything is OK. As you can see, I can make a file in the docs folder but not in the pdf_docs subfolder. The folder has a lot of files and is quite large - I wonder if I've reached a limit? I couldn't find anything on Google.

        Size: 51.0 GB (54,819,804,885 bytes)
        Size on disk: 52.0 GB (55,925,719,040 bytes)
        Contains 554,697 files

    EDIT: I've just checked and I can delete files... and for every file I delete I appear to be able to create a new one. This definitely points toward a limit in terms of the number of files?

  • Should this folder called Data be indexed?

    - by panny
    In the indexing options of Windows 7 there is a folder called Data which is excluded from indexing for the C:\ drive by default. Can someone confirm this, please? I was not able to locate that folder on my drive, nor include it in the search index. The difference in the number of indexed files is unsatisfying: the native Windows 7 indexing service indexes 377,703 files on six drives, while a third-party desktop search indexing service indexes 698,654 files on the same number of drives. Files under UAC control do not seem to be indexed without the proper privileges. How can this be circumvented?

  • How to configure a Linux kernel based on the modules currently in use?

    - by Carla
    Hello, I want to build a minimal kernel with only the things needed for my machine, so I started by compiling the kernel from the ground up, using the default configuration and adding things that I know for sure I have (e.g. Ethernet card, WiFi card, ...). But there are several other things that are not so easy to know about (e.g. the watchdog timer), so I came across AutoKernConf, which supposedly detects the hardware of the machine and generates a kernel configuration file with the settings for the found devices. The problem is that it contained several repeated settings and even some for hardware I don't have (I'm using a Dell laptop, and one of the things it "found" was something for a Toshiba). So I ended up building a kernel with the configuration that came out of the make allmodconfig command, which is a kernel with most things compiled as modules. Booting into that kernel and running lsmod, I can see all of the kernel modules in use (the ones really needed), and I would like to know if there is a tool or some way for me to parse that list and convert it to the corresponding kernel configuration file. Or how to map each module to the appropriate options in the kernel so that I can set them manually. Thank you very much for your time.
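
    As a rough illustration of the mapping step, the following Python sketch takes the module names from lsmod and searches a kernel source tree's Makefiles for the obj-$(CONFIG_...) lines that build them. The source-tree path is an assumption (adjust it to wherever your kernel sources live), and the name matching is deliberately naive, so treat the output as a starting point rather than a finished .config.

        import re
        import subprocess
        import sys
        from pathlib import Path

        def modules_in_use():
            """Return module names reported by lsmod (skipping the header line)."""
            out = subprocess.run(["lsmod"], capture_output=True, text=True, check=True).stdout
            return [line.split()[0] for line in out.splitlines()[1:] if line.strip()]

        def config_options_for(module, kernel_src):
            """Search kernel Makefiles for the CONFIG_ option(s) that build a module."""
            # lsmod shows underscores; the object file may use '-' or '_'.
            name = re.escape(module).replace("_", "[-_]")
            pattern = re.compile(r"obj-\$\((CONFIG_\w+)\)\s*\+?=\s*.*\b" + name + r"\.o\b")
            options = set()
            for makefile in Path(kernel_src).rglob("Makefile"):
                try:
                    options.update(pattern.findall(makefile.read_text(errors="ignore")))
                except OSError:
                    continue
            return options

        if __name__ == "__main__":
            src = sys.argv[1] if len(sys.argv) > 1 else "/usr/src/linux"  # assumed location
            for mod in modules_in_use():
                opts = config_options_for(mod, src) or {"<not found>"}
                print(f"{mod} -> {', '.join(sorted(opts))}")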

  • Screenflow file type convert to AVI?

    - by Dave
    I've got a couple of large files, 2-3 GB each, from a training course where the instructor used Screenflow on a Mac to record all his keypresses. I'm currently on a PC. Problem 1: how do I convert from .screenflow (and the associated .scc files) to AVI or something a PC can play? Problem 2: if I borrow a Mac, can I download http://www.telestream.net/screen-flow/overview.htm (which I think was the package used) and convert the files?

  • How to share malicious source code?

    - by darma
    I have a client whose site (not one I developed) is infected by a trojan/malicious code. I have asked him to send me the dirty files in a zip, but either Gmail or the unzipping is blocking them. I've tried text files and Word files, and I suspect many other file types will be blocked the same way, either by my mail client, anti-malware software, browser, etc. (which is normal). Do you know a way he could share those lines so I can read them and do some research on the malicious source code? An image/screenshot of his text editor would be an idea, but the files are long + I'd prefer to be able to copy/paste from them. Thank you!

  • What alternatives are available for shared folders encryption in Windows 2003 Server?

    - by snakepitar
    People in our company have asked about encrypting some of the shared folders published on a local Windows 2003 file server. The requirements are:

        Encrypt the files, so only a user or group of users can open them
        Avoid password-protected files; the encryption process should be transparent to the users
        Though files are encrypted, the backup software (BackupExec) must be able to copy them and access the binary content for verification
        We cannot install tools/software on the users' PCs; they want this to work automatically

    As we have very little experience managing servers, we'll be grateful for any help or suggestion offered.

  • Applying ACLs to a Dovecot public namespace

    - by larsks
    I have a public namespace defined in my Dovecot (dovecot-2.0.9) configuration that looks like this:

        namespace {
          type = public
          separator = .
          prefix = news.
          location = maildir:/var/spool/news
          subscriptions = no
        }

    I would like to make all the mailboxes in this namespace read-only. I've got the following configuration for the ACL plugin:

        plugin {
          acl = vfile:/etc/dovecot/acls:cache_secs=300
        }

    After perusing the documentation, it seemed that if I had a mail folder /var/spool/news/.foo.bar, I could place the following into /var/spool/news/.foo.bar/dovecot-acl:

        anyone rl

    But that doesn't have any effect. I also tried creating a file /usr/local/etc/dovecot/acls/news.foo.bar with the same contents, but that didn't do anything either. I've turned on mail debugging:

        mail_debug = yes

    But the log doesn't produce anything that appears to be relevant to ACL processing. I'm curious to know if anyone has gotten this to work correctly, and if so, whether you could provide some configuration examples. Also, if there's any way to do this that doesn't involve per-mailbox configuration (e.g. the ability to apply an ACL to news.* or something), that would be awesome. Getting the documented behavior for default ACLs working would be a step in the right direction.

  • Reduce "Metafile" memory usage?

    - by Jay Conrod
    My work computer (Windows 7 64-bit) spends a lot of time swapping memory when I switch between programs. This surprises me since I have 4 GB of RAM, and the programs I use aren't particularly RAM hungry (Outlook, Emacs, p4win, Firefox, various build tools). I downloaded RAMMap, and it shows over a gigabyte of memory used by "Metafile". From the Sysinternals blog:

        Metafile is part of the system cache and consists of NTFS metadata. NTFS metadata includes the MFT as well as the other various NTFS metadata files. ... In the MFT each file attribute record takes 1k and each file has at least one attribute record. Add to this the other NTFS metadata files and you can see why the Metafile category can grow quite large on servers with lots of files.

    So I understand what the "Metafile" data is... I work on large builds comprising hundreds of thousands of files (none are that big, but they add up to several gigabytes). My question is how can I reduce the amount of memory used by "Metafile"? I'm not actively using all those files at once, so why does Windows need to keep info in RAM? Restarting my machine every time I sync a new build is really annoying.

  • Read floppy from OpenVMS machine

    - by Goyuix
    I have a floppy I need to read the contents from - unfortunately it was formatted and the data written on an OpenVMS server. I believe the floppy is formatted "Files-11" and I can see parts of the MFT [equivalent] and file contents through a hex editor, however I would love to be able to mount this and actually read the files off. Is there a Files-11 FUSE module or other kernel module I can install to read this format? Any standalone utilities that can understand a floppy image taken with dd?

  • Any script or command line tool that will allow me to sync a local folder with webdav?

    - by daniels
    I have a local folder on my Mac that I want to sync with a WebDAV server. There are a lot of files in the folder, and after I edit some files or add/remove folders I need to be able to sync the changes to the WebDAV server, ignoring what is on the server and always using my files. Is there any script or tool that I can use from the command line to do that? Mounting the resource is not a solution.
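
    If a small script is acceptable, a one-way push can be done with plain WebDAV HTTP verbs (MKCOL for directories, PUT for files). The sketch below uses Python with the requests library; the local path, server URL, and credentials are placeholders, and it simply overwrites whatever is on the server without handling deletions.

        import os
        import requests
        from requests.auth import HTTPBasicAuth

        # Placeholder values - adjust to your folder, server, and credentials.
        LOCAL_ROOT = "/Users/me/myfolder"
        REMOTE_ROOT = "https://dav.example.com/myfolder"
        AUTH = HTTPBasicAuth("user", "password")

        def push_all(local_root, remote_root):
            """One-way sync: upload every local file, overwriting the server copy."""
            for dirpath, _dirnames, filenames in os.walk(local_root):
                rel = os.path.relpath(dirpath, local_root)
                remote_dir = remote_root if rel == "." else f"{remote_root}/{rel}"
                # Create the collection (directory); an existing one typically returns 405.
                requests.request("MKCOL", remote_dir, auth=AUTH)
                for name in filenames:
                    with open(os.path.join(dirpath, name), "rb") as fh:
                        requests.put(f"{remote_dir}/{name}", data=fh, auth=AUTH)

        if __name__ == "__main__":
            push_all(LOCAL_ROOT, REMOTE_ROOT)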

  • Speed up file access on home network

    - by kurasa
    I have 2 PCs (Windows 7 Ultimate) and a Mac running Windows 7 under VMware Fusion on my home network, tied together with a WRN1000 NETGEAR router. On one of the PCs I have a set of files (MYOB .myo). These use a data source to access the data in the files. Operations (reading, writing) on the .myo files are fine on the PC that hosts them, but on the other two machines they are painfully slow/unreliable, and I am wondering what I can do to speed this up. Some ideas I have are:

        1. Turn off the Windows firewall on all the Windows installations on the home network.
        2. Buy another router - specifically, a router to which I can connect a USB flash drive on the back, put the .myo files on that drive, and have all the PCs access the files from the flash drive on the router (does this speed things up?).

    Any advice greatly appreciated on how I can speed up this access to the data.
