Search Results

Search found 26454 results on 1059 pages for 'post parameter'.

Page 339/1059 | < Previous Page | 335 336 337 338 339 340 341 342 343 344 345 346  | Next Page >

  • Deploying a git project and a permission issue

    - by nixer
    I have a project hosted with gitolite on my own server, and I would like to deploy the whole project from the gitolite bare repository to an Apache-accessible location via a post-receive hook. The hook currently contains:

        echo "starting deploy..."
        WWW_ROOT="/var/www_virt.hosting/domain_name/htdocs/"
        GIT_WORK_TREE=$WWW_ROOT git checkout -f
        exec chmod -R 750 $WWW_ROOT
        exec chown -R www-data:www-data $WWW_ROOT
        echo "finished"

    The hook cannot finish; it fails with this error message:

        chmod: changing permissions of `/var/www_virt.hosting/domain_name/file_name': Operation not permitted

    which means git does not have enough rights to do it. The git source path is /var/lib/gitolite/project.git/, which is owned by gitolite:gitolite, and with these permissions Redmine (running as the www-data user) cannot reach the git repository to fetch changes. The whole project should be placed here: /var/www_virt.hosting/domain_name/htdocs/, which is owned by www-data:www-data. What changes should I make so that the post-receive hook works properly and Redmine can work with the repository? What I did was:

        # id www-data
        uid=33(www-data) gid=33(www-data) groups=33(www-data),119(gitolite)
        # id gitolite
        uid=110(gitolite) gid=119(gitolite) groups=119(gitolite),33(www-data)

    but it did not help. I want Apache to serve the project, Redmine to read the project's source files (under git), and git to deploy to the www-data-accessible path, all without problems. What should I do?
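
    One common way out (a sketch only, using the paths from the question; whether it fits depends on how strict your ownership requirements for Apache and Redmine are) is to stop chown-ing from the hook and instead make the target tree writable by a group both users share:

        # Run once as root: hand the tree to www-data:gitolite, make it
        # group-writable, and set the setgid bit so new files keep the group.
        WWW_ROOT="/var/www_virt.hosting/domain_name/htdocs/"
        chown -R www-data:gitolite "$WWW_ROOT"
        chmod -R g+rwX "$WWW_ROOT"
        find "$WWW_ROOT" -type d -exec chmod g+s {} +

        # The post-receive hook then only needs the checkout; note also that the
        # original hook's "exec chmod" replaces the shell, so the chown line
        # after it never runs anyway.
        #   GIT_WORK_TREE=$WWW_ROOT git checkout -f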

    Read the article

  • Why does the wireless network icon have a red X over it when everything seems to work?

    - by Kristo
    I booted my almost brand new laptop running Windows 7 this morning and noticed a red X through the wireless networking icon in the system tray. At first I thought something was wrong, but clicking on it shows a good connection to my wireless network. I had no problem getting here to post this question. I'm very new to Windows 7, so I have no idea how to troubleshoot this myself. Is there an actual problem here? Can I fix the icon so it doesn't falsely display an error (I assume that's what the red X means)? Here's what I know:

        - I can get here to post this question.
        - There's at least one unsecured network available that I'm not connected to.
        - I can see a bunch of wireless networks, presumably from my neighbors' houses.
        - There are no other computers turned on in my house right now.
        - The device manager shows no problems with any devices.
        - I can ping my default gateway, DNS, and yahoo.com with no problem.

    Read the article

  • UNIX tool to dump a selection of HTML?

    - by jldugger
    I'm looking to monitor changes on websites and my current approach is being defeated by a rotating top banner. Is there a UNIX tool that takes a selection parameter (id attribute or XPath), reads HTML from stdin and prints to stdout the subtree based on the selection? For example, given an html document I want to filter out everything but the subtree of the element with id="content". Basically, I'm looking for the simplest HTML/XML equivalent to grep.
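
    One possibility (a sketch; it assumes a libxml2 whose xmllint has --xpath, which appeared around libxml2 2.7.7, and the URL is hypothetical) is xmllint in HTML mode:

        # Print only the subtree of the element with id="content".  --html makes
        # the parser tolerant of real-world markup, "-" reads from stdin, and
        # parser warnings are discarded.
        curl -s http://example.com/page.html \
            | xmllint --html --xpath '//*[@id="content"]' - 2>/dev/null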

    Read the article

  • Limit NFS block size from server side?

    - by paulw1128
    Is it possible to enforce a maximum rsize/wsize in nfsd? I'm having issues related to IP fragmentation (yes, I'm stuck with NFS-over-UDP, contrary to the warnings in the manpage), and I have no practical access to the client mount command (it is buried in one of many TFTP boot images). http://nfs.sourceforge.net/nfs-howto/ar01s05.html mentions a kernel source parameter limiting the maximum block size, but I'm not going to get away with recompiling the nfsd kernel module, so that's not really an option either :-(
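
    If the clients really cannot be touched, one server-side possibility (an assumption: your kernel is recent enough to expose this knob, and the init script name here is the Debian/Ubuntu one) is nfsd's max_block_size, which has to be set while nfsd is stopped:

        # Cap the block size the server advertises so clients negotiate
        # rsize/wsize at or below 4096 bytes.
        /etc/init.d/nfs-kernel-server stop
        echo 4096 > /proc/fs/nfsd/max_block_size
        cat /proc/fs/nfsd/max_block_size        # verify
        /etc/init.d/nfs-kernel-server start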

    Read the article

  • Host is missing hostname and/or domain

    - by anlawang
    I am using Puppet 0.25.4 on Ubuntu 10.04. After installing Puppet I get the messages below:

        Nov 29 10:30:30 puppet puppetmasterd[4422]: Host is missing hostname and/or domain: pclient.example.com
        Nov 29 10:30:30 puppet puppetmasterd[4422]: Compiled catalog for pclient.example.com in 0.02 seconds

    I don't know how to fix it; who can help me? Thank you! My configuration (I installed Puppet with apt-get, so part of the configuration was set up for me): puppet.conf on the client:

        [main]
        server=puppet.example.com
        logdir=/var/log/puppet
        vardir=/var/lib/puppet
        ssldir=/var/lib/puppet/ssl
        rundir=/var/run/puppet
        factpath=$vardir/lib/facter
        pluginsync=false
        templatedir=$confdir/templates
        prerun_command=/etc/puppet/etckeeper-commit-pre
        postrun_command=/etc/puppet/etckeeper-commit-post
        certname=pclient.example.com
        node_name=cert

        [puppetd]
        runinterval=30

    puppet.conf on the server:

        [main]
        logdir=/var/log/puppet
        vardir=/var/lib/puppet
        ssldir=/var/lib/puppet/ssl
        rundir=/var/run/puppet
        factpath=$vardir/lib/facter
        pluginsync=true
        templatedir=$confdir/templates
        prerun_command=/etc/puppet/etckeeper-commit-pre
        postrun_command=/etc/puppet/etckeeper-commit-post

    I use the default node in site.pp. I am new to Puppet, so I don't know the reason for these problems. Thank you again!
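
    For what it's worth, that warning usually means Facter on the client could not work out a fully qualified hostname, so a quick check on pclient might look like this (a sketch; the /etc/hosts line is only an example):

        # What does facter think the node is called?
        facter hostname domain fqdn
        hostname -f
        # If domain/fqdn come back empty, giving the box an FQDN in /etc/hosts
        # (or via the "domain"/"search" lines in /etc/resolv.conf) is the usual fix:
        #   192.168.0.10   pclient.example.com   pclient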

    Read the article

  • How to make xvnc not kill the session on exit

    - by Cem
    Hello, I'm implementing remote desktop access to a server through Xvnc/xinetd/GDM. I'd like many users to connect to that server using VNC (thus getting the GDM login screen), and if an Xvnc session is closed I want the session to be locked (as with xlock) so that the next time the user connects, their session is resumed. I have tried several parameter tweaks, but unfortunately each time the VNC viewer is closed, the X session is also destroyed. Help or clues would be really appreciated.

    Read the article

  • How to disable automatic and forced fsck on disks in a linux software raid?

    - by mit
    This is the /etc/fstab entry for a RAID device /dev/md4 that is managed with mdadm and Webmin on an Ubuntu 10.04 64-bit server:

        /dev/md4 /mnt/md4 ext3 relatime 0 0

    We tried to switch off the automatic forced fsck on reboots by setting the last field of the line to 0 (zero), as we prefer to run our own scheduled fsck routine. But we found that the forced, automatic check still occurs on the underlying real disks, let's say sdb1 and sdc1. How can we switch that off?
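
    The reboot-time checks usually come from the ext3 superblock's mount-count and check-interval fields rather than from the fstab pass number (which only controls fsck ordering), so one approach is to clear them with tune2fs; a sketch using the device from the question:

        # Show the current mount-count / check-interval settings:
        tune2fs -l /dev/md4 | grep -iE 'mount count|check'
        # Disable the periodic forced checks (0 = never):
        tune2fs -c 0 -i 0 /dev/md4
        # If the messages really name the member disks, run the same commands
        # against whichever device actually carries the ext3 filesystem.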

    Read the article

  • IIS7 FTP7.5 and FXP

    - by cralexns
    I have FTP 7.5 on my Windows 2008 server (IIS 7) and would like to use it for FXP; however, regardless of the other server involved, I can never complete a transfer. It always fails with this message:

        [L] 501-Server cannot accept argument.
        [L] Win32 error: The parameter is incorrect.
        [L] Error details: Client IP on the control channel didn't match the client IP on the data channel.

    I've tested normal use and it works fine. Is there any setting I can change to allow FXP?

    Read the article

  • Need help troubleshooting PC

    - by brux
    I have had problems ever since my dog peed on my computer. Problem: it loads Windows fine, then at random intervals, from 5 minutes to 30 minutes, it restarts itself. There is nothing in the event log, no errors, no BSOD, just a cold restart. After restarting it sometimes POSTs and then restarts itself at the end of POST; it will do this many times and then finally load Windows. The cycle then begins again, and it will eventually restart. What I have done: I thought it was the HDD at first, since this is the only part of the computer that actually got wet (the case is off the PC and the dog peed down the front where the HDD is located). SeaTools, the Seagate HDD tool, found errors when I ran it inside Windows, so I ran it in DOS mode from a bootable USB drive; it found the same number of errors and fixed them all. I ran the scan again and it said "Good", and running the scan inside Windows also says "Good". So the HDD appears to be fine, but the problem persists: random restarts. What else could this be? I have taken the computer apart and cleaned everything, and also taken the PSU apart and cleaned it thoroughly. The problem still persists. What should my next steps be? Thanks in advance.

    Read the article

  • Custom settings with PHP on Nginx

    - by miki
    I have multiple websites set up using Nginx and Apache, but when I try to add a per-vhost PHP directive using fastcgi_param PHP_VALUE, the value gets applied to all of my vhosts. Checking the value of the PHP parameter from the PHP CLI still shows the original, server-wide value. For example, I used

        fastcgi_param PHP_VALUE "memory_limit=512M"

    in the nginx config for one domain, but it is propagated to all domains on the server, while 'php -i | grep memory_limit' still shows 128M. Not sure what I am missing.
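
    Worth noting: php -i runs the CLI SAPI, which never sees fastcgi_param values, so it will keep reporting the php.ini default no matter what nginx sends. A quick check from an actual web request (a sketch; the docroot path and URL are hypothetical):

        printf '<?php phpinfo();' > /var/www/domain1/htdocs/info.php
        curl -s http://domain1.example/info.php | grep -i memory_limit
        # Remember to delete info.php afterwards.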

    Read the article

  • Nginx conditional not evaluating correctly

    - by cjc
    I'm running into a weird problem with nginx and how it evaluates conditionals. Here's the relevant configuration:

        set $cors FALSE;
        if ($http_origin ~* (http://example.com|http://dev.example.com:8000|http://dev2.example.com)) {
            set $cors TRUE;
        }
        if ($request_method = 'OPTIONS') {
            set $cors $cors$request_method;
        }
        if ($cors = 'TRUE') {
            add_header 'Access-Test' "$cors";
            add_header 'Access-Control-Allow-Origin' "$http_origin";
            add_header 'Access-Control-Allow-Methods' 'POST, OPTIONS';
            add_header 'Access-Control-Max-Age' '1728000';
        }
        if ($cors = 'TRUEOPTIONS') {
            add_header 'Access-Test' "$cors";
            add_header 'Access-Control-Allow-Origin' "$http_origin";
            add_header 'Access-Control-Allow-Methods' 'POST, OPTIONS';
            add_header 'Access-Control-Allow-Headers' 'X-Requested-With, X-Prototype-Version';
            add_header 'Access-Control-Max-Age' '1728000';
            add_header 'Content-Type' 'text/plain';
        }

    So, the conditional blocks never trigger. When I remove the conditions, I see that the "Access-Test" header and the "Access-Control-Allow-Origin" header are set correctly, but, as noted, enabling the conditionals causes the headers not to be sent. I'm testing by running:

        curl -Iv -i --request "OPTIONS" -H "Origin: http://example.com" http://staging.example.com/

    Am I missing something obvious? I've tried the "if" with and without quotes, etc. This is nginx 1.2.9.

    Read the article

  • nginx url rewrites for php-urls

    - by Tronic
    I have to permanently redirect some old URLs in nginx. The old URLs are old-style PHP URLs that include a parameter for loading content. They look like this:

        http://www.foo.com/index.php?site=foo
        http://www.foo.com/index.php?site=bar

    I want to redirect them to other URLs like:

        http://www.foo.com/news
        http://www.foo.com/gallery

    Any advice on how I can achieve this? My attempts have failed. Thanks in advance!

    Read the article

  • Freebsd ports installing with parameters

    - by DeaglinG
    I'm trying to create an install script for some FreeBSD machines. I want to install a few ports but change several options before installing, without the dialog popping up. I've tried almost everything, but with no success. For example, I'm trying to install nginx with the HTTP_SSL module, but I can't seem to pass the correct parameter to the make install clean command. I also want to keep all the other default settings and only change this one. Any help will be appreciated.
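
    For what it's worth, a rough sketch of the usual non-interactive approach; the SSL knob name below is an assumption, and `make showconfig` will tell you what the port actually calls it and whether your tree wants WITH_* variables or the newer OPTIONS_SET style:

        cd /usr/ports/www/nginx
        make showconfig                 # list the option names this port understands
        # -DBATCH suppresses the dialog and takes the defaults; add the SSL knob on top.
        make -DBATCH WITH_HTTP_SSL_MODULE=yes install clean
        # or, on a ports tree using OptionsNG:
        # make BATCH=yes OPTIONS_SET=HTTP_SSL install clean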

    Read the article

  • How can I run a PowerShell function remotely?

    - by Aimar
    Using PowerShell, I plan to run many functions on a remote host to gather information. Here is an example that retrieves the content of a file remotely, just by running a function called getcontentfile with the name of the remote host as its parameter:

        function getcontentfile
        {
            [CmdletBinding()]
            param($hostname)
            $info = Get-Content "C:\fileinfo.xml"
            write-host $info
        }

    This function should return information about the remote host to the local instance of PowerShell. How can I modify this script to do that?

    Read the article

  • Is it possible to create a self-signed intermediate CA for ssl?

    - by limilaw
    I am trying to create my own SSL hierarchy like this:

        MyRootCA
          MyIntermediateCA
            MyCert

    I have installed MyRootCA and MyIntermediateCA, but Windows points out that MyIntermediateCA doesn't have the right to issue certs, and therefore it invalidates MyCert (screenshots: i.stack.imgur.com/XDtXp.png and i.stack.imgur.com/rZNQZ.png). I am using sign.sh from the mod_ssl package, which uses the openssl ca command. I wonder if there is any parameter/option that grants MyIntermediateCA the right to issue sub-level certs?
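
    For what it's worth, Windows behaves that way when the intermediate's certificate was signed without CA:TRUE in its basicConstraints extension. A minimal sketch of re-signing it with an explicit extensions file (the file names are assumptions):

        # Extensions the root should stamp onto the intermediate:
        printf '%s\n' 'basicConstraints = critical, CA:TRUE, pathlen:0' \
                      'keyUsage = critical, keyCertSign, cRLSign' > intermediate_ext.cnf

        openssl x509 -req -in MyIntermediateCA.csr \
            -CA MyRootCA.crt -CAkey MyRootCA.key -CAcreateserial \
            -days 3650 -extfile intermediate_ext.cnf \
            -out MyIntermediateCA.crt

        # Verify the flag took:
        openssl x509 -in MyIntermediateCA.crt -noout -text | grep -A1 'Basic Constraints'

    If you stick with sign.sh / openssl ca, passing -extensions v3_ca (with a v3_ca section containing the same basicConstraints) has the same effect.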

    Read the article

  • What method of MySQL mirroring should I use for this?

    - by user45745
    I'm running a web application hosting service (basically hosting forums for free), and I have two remote servers at my disposal. The code for the application is stored on both servers and isn't a problem, but I'm wondering how to deal with the databases. When someone goes to a site *.example-host.com, they are sent to one of the two servers, and both must be capable of loading the forums from a database. The database must also have write access, for when new members register or post topics etc. The main requirement is speed, but uptime is also important (if a server goes out, the site should still work). I have a few options, but I'm inexperienced and not sure which to go with:

        1) [PHP] Split the forum records 50:50 between the two servers. If a server does not have the record for a requested forum, it can request it from the other server over remote MySQL and load it. This idea sounded okay, until I realised that 50% of the time users would be waiting significantly longer for pages to load. I also realised that if one of the servers went down, half the forums would be inaccessible and registrations would have to be disabled.
        2) [MySQL] Dual-master replication. This would attempt to mirror the two databases and sounds perfect, but I've heard that it can be very problematic. I don't know how fast this is.
        3) [MySQL] Use standard replication, distribute read-only queries across both nodes and send read/write queries to the master. This sounds like a good option, but again, I'm not sure about speed. I also don't know what would happen if the master server went down.

    If you have any other suggestions, please post them :)

    Read the article

  • Amazon S3 Iterating Through Multi-Page Results. (withMarker)

    - by Jitu
    I am trying to iterate through an Amazon S3 bucket that has around 5000+ keys, using sample code based on the link provided in the Amazon Developer Guide: http://docs.amazonwebservices.com/AmazonS3/latest/dev/ListingObjectKeysUsingNetSDK.html The issue is that iteration fails when the NextMarker that gets passed is longer than 128 characters, which seems unusual, as withMarker accepts a string parameter and there is no documented limit for withMarker.

        request.Marker = response.NextMarker;

    Has anyone faced a similar issue? Thanks in advance.
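
    For comparison, the same marker-driven pagination can be exercised from the AWS CLI (an illustration only, with a hypothetical bucket name, and it assumes the CLI and jq are installed; the question itself uses the .NET SDK):

        #!/bin/sh
        # Page through the bucket 1000 keys at a time, feeding the last key of
        # each page back in as the Marker for the next request.
        bucket=my-bucket
        marker=""
        while : ; do
            page=$(aws s3api list-objects --bucket "$bucket" --marker "$marker" \
                   --max-keys 1000 --query 'Contents[].Key' --output json)
            if [ "$page" = "null" ] || [ "$page" = "[]" ]; then break; fi
            printf '%s\n' "$page" | jq -r '.[]'              # print this page of keys
            marker=$(printf '%s\n' "$page" | jq -r 'last')   # last key becomes next Marker
        done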

    Read the article

  • Why can I not edit or delete directories inside of this directory?

    - by user43053
    Hello there. At first I thought this was PHP-related, but maybe it isn't. My original post, which may be irrelevant now, is located at the bottom.

    The problem is that I have a directory, /articles/, with 10 subdirectories in it. I have been changing the permissions lately, and now all the permissions of the parent folder, subfolders and files are either chmod 755 or 777. I cannot move, delete or edit files inside this parent directory or its subdirectories with my FTP client. I can, however, edit, delete, create and change files and directories with PHP functions without problems. What may the problem be?

    OLD POST. Ignore everything below this line:

    If I create a directory with mkdir(), or create a file with fopen(), file_put_contents() or SimpleXMLElement::asXML(), I am unable to access the file with my FTP client or the cPanel File Manager. If I try to delete or edit them, I get errors. Dreamweaver suggests it is a permission problem or a network or filesystem fault, but I've set the permissions with chmod() to 0777, and when I check in cPanel, it confirms chmod 777. I also tried to use fileowner(), and the function returns int(99), the same owner as the files that I could access with my FTP client. It seems files and directories created with PHP can only be modified or deleted with PHP. I thought this must be a server-setup-related issue, so I am writing it here. I am on a shared server, and I have no idea about setting up servers.

    EDIT: It seems the problem is different. I cannot move files with the FTP client into the parent directory or its subdirectories either, so this problem may not be PHP-related after all. The problem seems to apply to any directory, regardless of whether it was created by PHP.

    EDIT 2: The parent directory has chmod 755.

    Thank you for your time. Kind regards, Marius

    Read the article

  • Computer is dying--what should I be looking for?

    - by Will
    Okay, I'm somewhat knowledgeable about computers, but I'm confused. My computer is dying slowly, and I'm not sure which part is causing it. Computer details: Vista, Dell machine, Intel Q6600 (2.4 GHz quad core), standard memory and drive (unknown manufacturer). Symptoms: I would best describe the symptoms as memory corruption. After a couple of days on, applications start crashing or failing to open for a lack of "resources". Sounds are corrupted. On-screen text gets corrupted; the characters of the text are garbled, not the pixels on the screen. Video memory seems untouched, as I haven't seen any misplaced pixels. Recently I've lost files on disk. I've also seen errors reporting a supposed lack of disk space, even though I have fifty gigs free. At one point I couldn't even get to the POST when booting up; after I cleaned everything (see next), this hasn't happened again. Diagnostic steps: First I cleaned the case. There was a lot of dust build-up on the heatsinks, so I cleaned all that up. No help. Next, I disconnected and reconnected everything, from power cables to memory (I did not reseat the CPU). No change. Last, I ran the standard Vista memory diagnostics and ran chkdsk. Both reported no errors found. I have not run any POST tests, now that I think about it. I'm at a loss at this point. The disk appears fine, the memory too. I'd expect motherboard issues to result in the machine not booting up, yet it boots every time. What should I be looking at? What more can I do?

    Read the article

  • How can I get (g)Vim to display the character count of the current file?

    - by OwenP
    I like to write tutorials and articles for a programming forum I frequent. This forum has a character limit per post. I've used Notepad++ in the past to write posts and it keeps a live character count in the status bar. I'm starting to use gVim more and I really don't want to go back to Notepad++ at this point, but it is very useful to have this character count. If I go over the count, I usually end up pasting the post into Notepad++ so I can see when I've trimmed enough to get by the limit. I've seen suggestions that :set ruler would help, but this only gives the character count via the current column index on the current line. This would be great if I didn't use paragraph breaks, but I'm sure you'd agree that reading several thousand characters in one paragraph is not comfortable. I read the help and thought that rulerformat would work, but after looking over the statusline format it uses I didn't see anything that gives a character count for the current buffer. I've seen that there are plugins that add this, but I'm still dipping my toes into gVim and I'm not sure I want to load random plugins before I understand what they do. I'd prefer to use something built in to vim, but if it doesn't exist it doesn't exist. What should I do to accomplish my goal? If it involves a plugin, do you use it and how well does it work?

    Read the article

  • Random HTTP 413 error on apache2/php/joomla site

    - by jfab
    I have a Joomla site, and every once in a while when I submit something via a form, I get an HTTP 413 error:

        Request Entity Too Large
        The requested resource /index.php does not allow request data with POST requests, or the amount of data provided in the request exceeds the capacity limit.

    In the error.log file I get:

        Invalid Content-Length, referer: [site]/index.php

    It doesn't seem this has anything to do with the actual size of the request, for the following reasons:

        a) I tinkered with the configuration of both Apache and PHP. In Apache I tried increasing LimitRequestBody, and in PHP post_max_size, max_input_vars, memory_limit, and even upload_max_filesize. Every value is far beyond what is sent in a typical request that generates an error.
        b) The error pops up quite randomly, and often just hitting refresh allows me to get through.
        c) I checked the request in Fiddler to make sure everything is right with the Content-Length stated in the header, and with the content of the request itself. Everything appears to be in order.

    A curious thing is that when I resent the exact same request via Fiddler, I never got the error. It seems I can only recreate it through a browser. So I'm at my wits' end here. I don't even know where to look for the problem anymore. I don't know if it's Apache or PHP (though I can't find anything in the PHP error logs, so maybe that means Apache is the more likely culprit?), or PHP in general, or my Joomla site in particular (my bets were on Joomla until I recreated the error with a test script containing a very basic POST form, though it does pop up much more often on the Joomla site). If anyone can give any advice on where to even begin with this, I'll be very grateful!

    Read the article

  • What happened to 'Copy Public Gallery Link' for folders in Photos in Dropbox?

    - by Transgenic
    I've been trying to figure out how to use the 'Photos' folder properly, but it does not seem to have the functionality that is mentioned on various websites. From what I've read, you are supposed to create a folder within the 'Photos' folder. At that point, it becomes a public gallery for which you can use the context menu item Copy Public Gallery Link. However, I do not have this context menu item. I found a topic on the Dropbox forum, but it is 2 years old; even the information on the various websites is over a year old. I found a topic from this year on Super User that mentions the Share link menu item. I understand that this functionality is basically the same as Copy Public Gallery Link; however, the key difference is that it opens the website, where I then need to click the 'Copy link to this page' link, which adds the link to the clipboard. My primary questions are:

        - Did they phase out Copy Public Gallery Link from the context menu entirely?
        - If not, is there a way to re-enable this menu item?
        - If not, am I using the Photos folder properly?
        - Lastly, if there is no way to get this menu item, is there some kind of script or hack-around that will let me simplify the Share link + Copy link to this page steps without requiring it to open my web browser?

    Note: I had to edit my post to remove other website links that I mention, since I'm considered a 'New User' here and am only able to post 2 hyperlinks. Sorry.

    Read the article

  • Postfix $smtpd_banner rules

    - by horen
    For monitoring purposes I would like to add the IP address to the Postfix smtpd_banner:

        smtpd_banner = $myhostname ESMTP $smtp_bind_address

    which works and outputs:

        220 mail.mydomain.com ESMTP 123.456.789.0

    Now I am wondering if there are any (negative) repercussions to expect. I couldn't find anything about it in the RFC docs. The Postfix docs add another parameter ($mail_name) in their example, so I think I am fine. I just want to verify that my syntax is correct and is allowed.
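
    If it helps, one way to see both the configured value and what actually goes out on the wire (assuming the server listens on port 25 on this host; nc is just a convenient client):

        postconf smtpd_banner                               # the effective setting
        printf 'QUIT\r\n' | nc -w 5 mail.mydomain.com 25    # the 220 line is the banner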

    Read the article
