Search Results

Search found 70459 results on 2819 pages for 'file sync'.

Page 412 of 2819

  • Can I choose a sparse file as vdev for a zfs pool?

    - by Karl Richter
    man zpool states that a vdev for a zfs pool can be a "regular file". Can I specify a sparse file (the warning about the integrity of the file being determined by the underlying filesystem should apply with the same relevance for a sparse file)? The ZFS administration guide on https://pthree.org/2012/12/04/zfs-administration-part-i-vdevs/ states that file vdevs "must be preallocated, and not sparse files or thin provisioned" (thanks to @jlliagre). On https://wiki.archlinux.org/index.php/Experimenting_with_ZFS sparse files are used without any comment.
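    A minimal sketch of the difference, assuming a test pool named "tank" and file paths under /srv (both placeholders); per the guide quoted above, the file vdev should be preallocated rather than sparse:

      # Sparse file: no blocks are allocated up front (this is what the warning is about)
      truncate -s 2G /srv/sparse-vdev.img

      # Preallocated file: blocks are actually reserved on the underlying filesystem
      fallocate -l 2G /srv/prealloc-vdev.img    # or: dd if=/dev/zero of=/srv/prealloc-vdev.img bs=1M count=2048

      # zpool create accepts either; the integrity caveat from man zpool applies in both
      # cases, it is just worse for a sparse file that may later fail to allocate blocks.
      zpool create tank /srv/prealloc-vdev.img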

    Read the article

  • Windows: Making Windows Explorer distinctive (changing background color or file/folder icons) for specific folder

    - by MacGyver
    Is there a way to change the background color in Windows 7 and Windows Server 2008 when the current folder being displayed meets a certain condition? Or is there a way I can change the icons of the files and folders within that folder so it's distinctive--similar to how Tortoise SVN does it for code checked out from a repository? Why? I'd like to do this for a deployment directory on a live server so users don't accidentally commit code to a certain environment. Like myself. :-)

    Read the article

  • Need data on disk drive management by OS: getting base I/O unit size, "sync" option, Direct Memory Access

    - by Richard T
    Hello all, I want to ensure I have done all I can to configure a system's disks for serious database use. The three areas I know of (any others?) to be concerned about are:
    1. I/O size: the database engine's native I/O size and the disk's native size should either match, or the database's native I/O size should be a multiple of the disk's.
    2. Disks that are capable of Direct Memory Access (e.g. IDE) should be configured for it.
    3. When a disk says it has written data persistently, it must be so! No keeping it in cache and lying about it.
    I have been looking for information on how to check these on CentOS and Ubuntu, but can't seem to find anything at all! I want to be able to check these settings and change them if needed. Any and all input appreciated.
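    Not an authoritative checklist, but a sketch of where these settings are usually visible on CentOS/Ubuntu; the device names are placeholders and hdparm output varies by drive and controller:

      # Native sector/block sizes the kernel reports for the disk
      cat /sys/block/sda/queue/logical_block_size
      cat /sys/block/sda/queue/physical_block_size
      blockdev --getss --getpbsz --getbsz /dev/sda

      # DMA and drive identification (for older IDE disks, hdparm -d shows/sets DMA)
      hdparm -I /dev/sda | grep -i -A2 dma
      hdparm -d /dev/hda

      # Write cache: if it is on, the drive may acknowledge writes before they hit the platter
      hdparm -W /dev/sda      # query the current setting
      hdparm -W 0 /dev/sda    # disable write caching (trades throughput for durability)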

    Read the article

  • Cisco Router - Add a missing MIB file

    - by Jonathan Rioux
    I have a Cisco 881W, and I would like to set up NBAR in my NetFlow Analyzer, but it says my router is missing the MIB that NFA needs in order to poll the router over SNMP for NBAR information. The FAQ page of the NetFlow Analyzer website addresses my error:

      Q. I am able to issue the command "ip nbar protocol-discovery" on the router and see the results. But NFA says my router does not support NBAR. Why?
      A. Earlier versions of IOS support NBAR discovery only on the router, so you can execute "ip nbar protocol-discovery" and see the results, but support for the NBAR Protocol Discovery MIB (CISCO-NBAR-PROTOCOL-DISCOVERY-MIB) came only in later releases. It is needed for collecting data via SNMP. Please verify whether your router's IOS supports CISCO-NBAR-PROTOCOL-DISCOVERY-MIB.

    The missing MIB is CISCO-NBAR-PROTOCOL-DISCOVERY-MIB, and I found it here: ftp://ftp.cisco.com/pub/mibs/v2/CISCO-NBAR-PROTOCOL-DISCOVERY-MIB.my. But how can I add this MIB to the router? The IOS image on my router is c880data-universalk9-mz.151-3.T1.bin.
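    For what it's worth, the MIB file itself isn't loaded onto the router: support for it is a property of the IOS image, and the .my file is only needed by the SNMP manager so it can translate object names into OIDs. A hedged sketch of checking support from a Linux box with net-snmp (the community string, paths and the 1.3.6.1.4.1.9.9.244 root OID are assumptions; verify the OID against the MIB file):

      # Make the MIB (and its Cisco dependencies such as CISCO-SMI) available to net-snmp
      cp CISCO-NBAR-PROTOCOL-DISCOVERY-MIB.my ~/.snmp/mibs/

      # Walk the MIB root; any response means this IOS image exposes NBAR protocol discovery
      snmpwalk -v2c -c public -m +CISCO-NBAR-PROTOCOL-DISCOVERY-MIB 192.0.2.1 1.3.6.1.4.1.9.9.244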

    Read the article

  • Host timeout during file upload/download over SFTP/SSH

    - by kritop
    I tried different clients because I thought it was client-related, but all of them sooner or later disconnect or stop uploading/downloading files and end in a timeout disconnect. After a reconnect it works again for a while. Really strange; I cannot figure out the reason. I'm on a Mac and the server is a Debian VPS. If you need further information, please ask! I appreciate any tips, because I'm kinda stuck!
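    One thing that often helps when idle SFTP/SSH sessions are dropped by a NAT router or firewall is enabling client-side keepalives; a sketch for the Mac side, with the host name as a placeholder (the server-side equivalents in sshd_config are ClientAliveInterval/ClientAliveCountMax):

      # ~/.ssh/config on the Mac
      Host my-vps.example.com
          ServerAliveInterval 30    # send a keepalive after 30 s of silence
          ServerAliveCountMax 4     # give up after 4 unanswered keepalives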

    Read the article

  • Why does Ubuntu treat the hosts file so strangely?

    - by z4y4ts
    I have an almost fresh Ubuntu desktop box. The OS was installed two weeks ago and updated from the karmic repositories. Last week I had no problems with DNS, but this week something changed. I'm not sure what or when, and not sure whether I changed any configs. So now I have a really weird situation. According to the configuration files, name resolution should work normally.

      /etc/hosts
      127.0.0.1 localhost test
      127.0.1.1 desktop

      /etc/host.conf
      order hosts,bind
      multi on

      /etc/resolv.conf
      # Generated by NetworkManager
      search <search servers obtained via DHCP>
      nameserver 192.168.0.3

      /etc/nsswitch.conf
      passwd: compat
      group: compat
      shadow: compat
      hosts: files mdns4_minimal [NOTFOUND=return] dns mdns4
      networks: files
      protocols: db files
      services: db files
      ethers: db files
      rpc: db files
      netgroup: nis

    But in fact it is not.

      user@test ~$ ping test
      PING localhost (127.0.0.1) 56(84) bytes of data.
      [skip]

    Pinging is ok.

      user@test ~$ host test
      test.myviacube.com has address xx.xxx.161.201

    I suspect that NetworkManager might be causing this misbehavior, but I don't know where to start checking. Any thoughts or suggestions?
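    Note that the two tools do not take the same path: ping resolves names through nsswitch (so it finds "test" on the 127.0.0.1 line of /etc/hosts), while host and dig query the DNS servers from /etc/resolv.conf directly. A quick way to see both answers side by side:

      getent hosts test    # resolves via /etc/nsswitch.conf (files first), like most applications
      host test            # asks the DNS server (192.168.0.3) directly, bypassing /etc/hosts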

    Read the article

  • .NET not processing an XML file in IIS

    - by Stuart McIntosh
    We have 2 servers: one already configured with .NET, which works fine, and a new one which appears to be configured the same, but when I open an XML page in Internet Explorer it complains about the <% tag. We have IIS on Windows Server 2003 SP2. The website is configured with .NET 1.1.4322. In ISAPI extensions I have set the .XML extension to use c:\windows\microsoft.net\framework\v1.1.4322\aspnet_isapi.dll. But the page:

      <property name="documentmaxage" value="0"/>
      <property name="documentmaxstale" value="0"/>
      <var name="m_Prompt_Path" />
      <form id="InitVoiceXmlDoc">
        <block>
          <assign name="m_Prompt_Path" expr="&quot;<% Response.Write(Request.QueryString["m_Prompt_Path"]); %>&quot;"/>
        </block>
      </form>

    gives the error:

      The XML page cannot be displayed
      Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
      The character '<' cannot be used in an attribute value. Error processing resource 'http://localhost:11119/fails.xml'. Lin...
      &quo...

    We have the same config on the other server and it works fine. So are there other options apart from the ISAPI extensions that I need to look at? If I give the page an .aspx suffix, of course it works fine.

    Read the article

  • Apache server doesn't create directory or file under www-data user [duplicate]

    - by Harkonnen
    This question already has an answer here: "What permissions should my website files/folders have on a Linux webserver?"
    Very much an Apache newbie here. I installed Apache 2.4 on my Arch server, where I installed newznab (a newsgroups indexer). I have noticed that all the files newznab needs to create are created under my login user, and not the default Apache user (www-data). I read that it's bad security practice to allow www-data to write files, and I agree. But as an Apache newbie, I would like to know where (in httpd.conf, I suppose?) the user allowed to write files can be configured, because I want another account to be allowed to write files instead of my main account.
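    A few hedged checks, with paths as they usually are on Arch (the newznab directories and the account name are placeholders). Also note that if the files are being created by newznab's command-line update scripts run from your shell or cron, they will be owned by whoever runs them, regardless of Apache's configuration:

      # Which account the Apache workers actually run as
      ps -o user= -C httpd | sort -u

      # The run-as account is set by the User/Group directives in the main config
      grep -E '^(User|Group)' /etc/httpd/conf/httpd.conf

      # One option: a dedicated unprivileged account that owns only the writable directories
      useradd -r -s /usr/bin/nologin newznab
      chown -R newznab:newznab /var/www/newznab/covers /var/www/newznab/nzbfiles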

    Read the article

  • Using NOPASSWD for specific commands in sudoers file, PASSWD for all others

    - by jberryman
    I would like to configure sudo so that users can run some specific commands without entering a password (for convenience) and must enter a password for all other commands. This is what I have, but it does not work; a password is always required:

      Defaults env_reset
      Defaults timestamp_timeout = 1

      root ALL=(ALL:ALL) ALL

      # Allow members of group sudo to execute any command
      %sudo ALL=(ALL:ALL) NOPASSWD: /usr/sbin/pm-suspend, /usr/bin/apt-get, PASSWD: ALL

      #includedir /etc/sudoers.d

    Note that this is a Debian system, which uses the convention of adding users to the "sudo" group. Thanks.
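    A minimal sketch of the ordering that is usually suggested, assuming membership of the "sudo" group: when several sudoers entries match, the last match wins, so the passwordless exceptions need to come after the catch-all rule (and after anything pulled in by #includedir that also matches the group):

      Defaults env_reset
      Defaults timestamp_timeout = 1

      root  ALL=(ALL:ALL) ALL

      # Password required for everything...
      %sudo ALL=(ALL:ALL) ALL

      #includedir /etc/sudoers.d

      # ...except these two commands (kept last so nothing overrides the NOPASSWD tag)
      %sudo ALL=(ALL:ALL) NOPASSWD: /usr/sbin/pm-suspend, /usr/bin/apt-get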

    Read the article

  • Wildcard subdomain to file htaccess

    - by Mikkel Larson
    I have a problem with an .htaccess wildcard redirect. My base configuration is set to work with www.domain.com and domain.com; this is governed by 2 .htaccess files:

    1: /home/DOMAIN/public_html/.htaccess

      AddDefaultCharset utf-8
      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^(www.)?festen.dk$
      RewriteCond %{REQUEST_URI} !^/public/
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^(.*)$ /public/$1
      RewriteCond %{HTTP_HOST} ^(www.)?festen.dk$
      RewriteRule ^(/)?$ public/index.php [L]

    2: /home/DOMAIN/public_html/public/.htaccess

      AddDefaultCharset utf-8
      <IfModule mod_rewrite.c>
        Options +FollowSymLinks
        RewriteEngine On
      </IfModule>
      <IfModule mod_rewrite.c>
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php/$1 [L]
      </IfModule>

    Now I want to redirect www.[SUBDOMAIN].domain.com/[PATH] and [SUBDOMAIN].domain.com/[PATH] to public/index.php/subdomaincontroller/realsubdomain/[PATH]. My solution so far was to add the following to 2: /home/DOMAIN/public_html/public/.htaccess:

      <IfModule mod_rewrite.c>
        RewriteCond %{HTTP_HOST} !www.domain.com$ [NC]
        RewriteCond %{HTTP_HOST} ^(www.)?([a-z0-9-]+)domain.com [NC]
        RewriteRule (.*) subdomaincontroller/realsubdomain/%2/$1 [L]
      </IfModule>

    Sadly this does not work. Can anyone help me, please?
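    Untested, but one likely culprit is the unescaped dots and the missing literal "." before domain.com in the HTTP_HOST pattern, so %2 never captures what you expect. A hedged variant of the block added to file 2, assuming wildcard DNS and a matching ServerAlias already point *.domain.com at this vhost (%2 refers to the second capture of the last matched RewriteCond):

      <IfModule mod_rewrite.c>
        # Leave the bare domain and plain www alone
        RewriteCond %{HTTP_HOST} !^(www\.)?domain\.com$ [NC]
        # Capture the subdomain, with or without a leading www.
        RewriteCond %{HTTP_HOST} ^(www\.)?([a-z0-9-]+)\.domain\.com$ [NC]
        # Don't rewrite requests already routed to the controller
        RewriteCond %{REQUEST_URI} !^/?subdomaincontroller/
        RewriteRule (.*) subdomaincontroller/realsubdomain/%2/$1 [L]
      </IfModule>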

    Read the article

  • Webserver security, intrusion detection, and file integrity

    - by enfield
    I would like to add some type of tracking/alerting on some Linux webservers running PHP and Apache. In doing searches I have come across a lot of info from 2006-2009; I would like to revisit things and see what others are doing now. The main purpose is to track when any files are changed and, if so, alert me somehow. The same goes for an IDS, and hopefully something that can reside on the same server. Since some of these are small-scale projects I would prefer open-source/free solutions that are really effective, although I would still like to hear of other alternatives if someone has the experience and the cost can be justified.
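    On the file-integrity side, AIDE is one of the usual free options (OSSEC and Samhain are alternatives that also cover more of the host-IDS/alerting side and can run on the same server). A minimal sketch; database paths and the config location vary by distribution:

      aide --init                                          # build the baseline database from /etc/aide.conf
      mv /var/lib/aide/aide.db.new /var/lib/aide/aide.db   # promote the new database (path varies by distro)
      aide --check                                         # report anything changed since the baseline
      # run the check from cron and mail/alert on non-empty output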

    Read the article

  • Best tools for "ssh tail -f" style log file monitoring and analysis

    - by dougnukem
    I'm looking for a tool to monitor custom PHP Error logs/Apache and possibly Java logs on remote development servers. I'm not looking for a full production log system like Splunk, but something that's a little more flexible than an ssh terminal doing a "tail -f". Perhaps something that will:
      * Monitor multiple log files to my local machine for searching/analysis later
      * Allow "alerts" when certain strings appear in the log
      * Provide some kind of tabbed/dashboard view of the multiple logs being monitored (in total less than 10 logs).
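    For the middle ground, multitail can already give a multi-window view over "ssh tail", and a small shell pipeline covers the local-copy and alerting parts. A sketch with the host, log path, pattern and notification command all as placeholders:

      mkdir -p ~/logs
      ssh devserver 'tail -F /var/log/apache2/error.log' |
        tee -a ~/logs/devserver-error.log |                  # keep a local copy for later searching
        grep --line-buffered -E 'PHP (Fatal|Parse) error' |
        while read -r line; do
          notify-send "devserver error log" "$line"          # swap in any local alert command
        done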

    Read the article

  • Resizing a mysterious partition written by dd'ing an ISO file

    - by Jon
    I downloaded Clonezilla and then wrote it to a USB flash drive with this:

      dd if=clonezilla.iso of=/dev/sdb

    I've confirmed that the system boots and Clonezilla runs from the flash drive. I want to store a Clonezilla backup on the same flash drive Clonezilla is running on, but I tried it and ran out of space, so I started looking at how to resize the mysterious partition type that was generated from the ISO:

      fdisk -l /dev/sdb
      ....
      Device Boot   Start   End   Blocks   Id  System
      /dev/sdb1 *       1   111   113664   17  Hidden HPFS/NTFS
      ....

    I've tried using ntfsresize from the Debian ntfsprogs package. I'm trying gparted next, but thought I'd ask here if anyone knows a neat way to resize a partition created on flash from a live CD image. Thanks in advance, Jon. P.S. Assume Debian 6, please.
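    If resizing the ISO-derived partition turns out to be fragile (the partition type says Hidden HPFS/NTFS, but the contents are an ISO9660/isohybrid layout that ntfsresize and gparted won't really understand), one workaround people use is to leave /dev/sdb1 alone and put the backups on a second partition created in the free space after cylinder 111. A rough, untested sketch:

      fdisk /dev/sdb        # n -> new primary partition 2 spanning the free space,
                            # t -> type 83 (Linux), w -> write the table
      partprobe /dev/sdb    # re-read the partition table
      mkfs.ext3 /dev/sdb2   # format the new partition as the image store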

    Read the article

  • DFS-R (2008 and R2) 2 node server cluster, all file writes end in conflictAndDeleted

    - by Andrew Gauger
    Both servers in a 2-server cluster are reporting event 4412 20,000 times per day. If I sit in the ConflictAndDeleted folder I can watch files appearing and disappearing. Users report that files saved by peers at the same location are overwriting each other. The configuration began with a single server; DFS-R was then set up using the 2008 R2 wizard, which created the share on the second server. DFS-N was set up independently. Windows users have drives mapped using a domain-based namespace (\\domain.com\share). Mac users are pointed directly at the new server share created by DFS-R. It is PC users reporting most of the lost files, but there have also been 2 reports from Mac users about files reverting.

    Read the article

  • Time sync fails on Hyper-V VM, but succeeds when I log in as a domain user

    - by Richard Beier
    We have a Windows Server 2003 SP2 VM running on Hyper-V (Server 2008 R2 host). The VM has Hyper-V time synchronization enabled. I noticed that the time on the VM was fast by around 25 minutes. I saw the following in the event log: The time provider NtpClient is configured to acquire time from one or more time sources, however none of the sources are currently accessible. No attempt to contact a source will be made for 15 minutes. NtpClient has no source of accurate time. The time provider NtpClient cannot reach or is currently receiving invalid time data from ourdc.ourdomain.local (ntp.d|192.168.2.18:123-192.168.2.2:123). Time Provider NtpClient: No valid response has been received from domain controller ourdc.ourdomain.local after 8 attempts to contact it. This domain controller will be discarded as a time source and NtpClient will attempt to discover a new domain controller from which to synchronize. I had been logged in as a local user. (We have an old app that runs on this VM - it requires a user to be logged in at all times, and we use a non-domain user account for this.) When I logged in as a domain user, the clock almost immediately corrected itself. Running "w32tm /monitor" and "net time" as the domain user showed no errors, and indicated that our domain controller was the time source. Does anyone know what might cause this, and why logging in under a domain account fixes the problem? I'm wondering if the time will start to drift again. Thanks for your help, Richard
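    If it drifts again, the usual suspects are the guest's w32time source and Hyper-V's time-synchronization integration component fighting each other; a common recommendation for domain-joined guests is to let the domain hierarchy win. A hedged sketch of the relevant Server 2003 commands, run inside the guest:

      w32tm /config /syncfromflags:domhier /update
      net stop w32time && net start w32time
      w32tm /resync /rediscover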

    Read the article

  • My .htaccess file redirect problems

    - by Glenn Curtis
    I am hoping you can help me! Below is the .htaccess file for my Apache server running on top of Ubuntu Server. This is my local server, which I installed so I can develop my site on it instead of using my live site. However, I have all my files and the database on localhost now, but each time I access my server, vaio-server (it's a Sony laptop), it just takes me to my live site! Everything is in the root of Apache, /var/www - it's the only site I will develop on this system, so I don't need to configure it to look at any more than this one site. I think that's all; the Apache files, sites-available/default etc. are as standard. Please help! Many thanks, Glenn Curtis.

      DirectoryIndex index.php index.html

      # Upload sizes
      php_value upload_max_filesize 25M
      php_value post_max_size 25M

      # Avoid folder listings
      Options -Indexes

      <IfModule mod_rewrite.c>
        Options +FollowSymlinks
        RewriteEngine on
        RewriteBase /

        # Maintenance
        #RewriteCond %{REQUEST_URI} !/maintenance.html$
        #RewriteRule $ /maintenance.html [R=302,L]

        #Redirects to www
        #RewriteCond %{HTTP_HOST} !^vaio-server [NC]
        #RewriteCond %{HTTPS}s ^on(s)|off
        #RewriteRule ^(.*)$ glenns-showcase.net/$1 [R=301,QSA,L]

        #Empty string
        RewriteRule ^$ app/webroot/ [L]
        RewriteRule (.*) app/webroot/$1 [L]
      </IfModule>

    Read the article

  • Windows 2008 as home file server and more

    - by Christian W
    I currently have a FreeNAS unit as a NAS and a Win2k8R2 unit as a server, and I would like to consolidate these two units into one. What I really like about the FreeNAS unit is the ZFS filesystem, and the only reason I care about ZFS is the easy way I can grow an existing filesystem just by inserting a new drive. How would this work in Win2k8? Say I set up the unit with a separate drive as C: and a 1TB drive as D:, with D: segmented into D:\Videos, D:\Music and D:\Pictures. When everything gets close to filling the storage drive, I would like to expand the storage, but I wouldn't want to end up with E:\Videos or D:\Videos2 (using the NTFS folder-mount thingy). I still want all my videos to reside in D:\Videos, and I want the OS to decide which drive they are stored on... some kind of on-the-fly JBOD expansion. :) Is this at all possible in Windows 2008?

    Read the article

  • pfsense log file retention

    - by Colin Pickard
    We have a pfSense firewall in our datacentre. By default, pfSense is only storing 500K of firewall filter logs, which is only a few hours for us. How can I increase this? pfSense uses clog rather than the usual BSD newsyslog. I only want the log for debugging firewall rules, not compliance or anything, and the firewall has 100GB of spare disk space, so I'd rather have the logs on the firewall itself than set up a syslog server.
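    A hedged sketch of the usual approach from the pfSense shell: the filter log is a fixed-size circular clog file, so it has to be re-created at a larger size (the flags and stock path here are from memory, and some releases will also want the size raised in the log settings so the file isn't recreated at the default size on the next filter reload):

      clog -i -s 10485760 /var/log/filter.log   # reinitialize the circular log at roughly 10 MB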

    Read the article

  • How can I store all my level data in a single file instead of spread out over many files?

    - by Jon
    I am currently generating my level data and saving it to disk to ensure that any modifications done to the level are saved. I am storing "chunks" of 2048x2048 pixels, one file per chunk: whenever the player moves over a section that doesn't have a file associated with the position, a new file is created. This works great and is very fast. My issue is that, as you play, the file count gets larger and larger. I'm wondering what techniques can be used to alleviate the file count without taking a performance hit. I am interested in how you would efficiently store/seek/update this data in a single file instead of multiple files.

    Read the article

  • How to get LAN ip to a variable in a Windows batch file

    - by Ville Koskinen
    I'm streaming audio from my Windows 7 laptop to a sound card attached to a router. I have a little batch script to start streaming:

      REM Kill any instances of vlc
      taskkill /im vlc.exe
      "c:\Program Files\VideoLAN\VLC\vlc.exe" <parameters to start http streaming>
      REM Wait for vlc
      TIMEOUT /T 10
      REM start playback on router
      plink -ssh [email protected] -pw password killall -9 madplay
      plink -ssh [email protected] -pw password wget -q -O - http://192.1.159:8080/audio | madplay -Q --no-tty-control - &

    As you can see, the HTTP stream address is hard-coded. It would be nice to get the address dynamically so I can reuse the script on other machines. Any ideas?
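    One approach that is often used in batch files is to pull the local IPv4 address into a variable and substitute it for the hard-coded address in the wget URL. A sketch that assumes an English Windows 7 ("IPv4 Address" is a localized label) and simply takes the first adapter that matches:

      REM Capture the first IPv4 address reported by ipconfig into %LANIP%
      for /f "tokens=2 delims=:" %%a in ('ipconfig ^| findstr /c:"IPv4 Address"') do (
          set LANIP=%%a
          goto :gotip
      )
      :gotip
      set LANIP=%LANIP: =%
      echo Will stream from http://%LANIP%:8080/audio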

    Read the article
