Search Results

Search found 21235 results on 850 pages for 'www'.

Page 197/850

  • Mercurial says "nothing changed", but it did. Sometimes my software is too clever.

    - by user12608033
    It seems I have found a "bug" in Mercurial. It takes a shortcut when checking for differences in tracked files. If the file's size and modification time are unchanged, it assumes its contents are unchanged:

      $ hg init .
      $ cp -p .sccs2hg/2005-06-05_00\:00\:00\,nicstat.c nicstat.c
      $ ls -ogE nicstat.c
      -rw-r--r-- 1 14722 2012-08-24 11:22:48.819451726 -0700 nicstat.c
      $ hg add nicstat.c
      $ hg commit -m "added nicstat.c"
      $ cp -p .sccs2hg/2005-07-02_00\:00\:00\,nicstat.c nicstat.c
      $ ls -ogE nicstat.c
      -rw-r--r-- 1 14722 2012-08-24 11:22:48.819451726 -0700 nicstat.c
      $ hg diff
      $ hg commit
      nothing changed
      $ touch nicstat.c
      $ hg diff
      diff -r b49cf59d431d nicstat.c
      --- a/nicstat.c Fri Aug 24 11:21:27 2012 -0700
      +++ b/nicstat.c Fri Aug 24 11:22:50 2012 -0700
      @@ -2,7 +2,7 @@
        * nicstat - print network traffic, Kb/s read and written. Solaris 8+.
        * "netstat -i" only gives a packet count, this program gives Kbytes.
        *
      - * 05-Jun-2005, ver 0.81 (check for new versions, http://www.brendangregg.com)
      + * 02-Jul-2005, ver 0.90 (check for new versions, http://www.brendangregg.com)
        * [...]

    Now, before you agree or disagree with me on whether this is a bug, I will also say that I believe it is a feature. Yes, I feel it is an acceptable shortcut because in "real" situations an edit to a file will change the modification time by at least one second (the resolution that hg diff or hg commit is looking for). The benefit of the shortcut is greatly improved performance of operations like "hg diff" and "hg status", particularly where your repository contains a lot of files. Why did I have no change in modification time? Well, my source file was generated by a script that I have written to convert SCCS change history to Mercurial commits. If my script can generate two revisions of a file within a second, and the files are the same size, then I run afoul of this shortcut. Solution - I will just change my script to apply the modification time from the SCCS history to the file prior to commit. A "touch -t" will do that easily.
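
    For reference, a minimal sketch of the fix described above; the timestamp and commit message are illustrative, not taken from the actual conversion script:

      # apply the SCCS delta's time to the file before committing, so every
      # imported revision carries a distinct modification time
      touch -t 200507020000.00 nicstat.c
      hg commit -m "imported SCCS delta of 2005-07-02"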

    Read the article

  • Unable to reach files in subfolder with domain name in path in IIS 5.

    - by Chuck Conway
    In IIS 5, files under the URL http://acme.com/_cache/cache-www.acme.com/v3.css are not accessible. All files below "cache-www.acme.com" are unreachable. I've verified that the files exist. Permissions are not a problem; I've assigned "Everyone" to the files and given "Everyone" full rights. What I have determined is that in IIS 5, if there is a domain in the folder path, IIS 5 gets confused... Other JavaScript files outside the directory come down fine... Any thoughts?

    Read the article

  • What port should I use for my reverse proxy to Apache 2 from nginx?

    - by meder
    I have nginx set up as the front end on port 80. I want to set up django+mod_wsgi on Apache2. I'm worried that if I leave Apache2 on 80 it will cause a conflict. Is it better to avoid the headache and change Apache to a different port?

      server {
          listen 80;
          server_name work.domain.org;

          access_log /www/work.domain.org/log/access.log;
          error_log /www/work.domain.org/log/error.log;

          location / {
              proxy_pass http://127.0.0.1:8080/;
              proxy_redirect off;
              proxy_set_header Host $host;
              proxy_set_header X-Real-IP $remote_addr;
              proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
              proxy_set_header X-Magic-Header "secret";
              client_max_body_size 10m;
          }
      }
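
    Since the nginx block above already proxies to 127.0.0.1:8080, a matching Apache setup would simply listen there instead of on port 80. A minimal sketch, assuming a Debian-style layout (file paths and the WSGI script location are illustrative):

      # /etc/apache2/ports.conf -- bind Apache to localhost:8080 only, so it
      # never competes with nginx for port 80
      Listen 127.0.0.1:8080

      # /etc/apache2/sites-available/work.domain.org
      <VirtualHost 127.0.0.1:8080>
          ServerName work.domain.org
          WSGIScriptAlias / /www/work.domain.org/app/django.wsgi
      </VirtualHost>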

    Read the article

  • How can I transfer a Zabbix item between hosts and keep the item's statistics?

    - by Stepchik
    There are two servers (srv1 and srv2). A MySQL server is installed on each of them. MySQL on srv1 contains a database (db1). The Zabbix server gets its statistics through a configured agent user parameter (https://www.zabbix.com/documentation/2.0/manual/config/items/userparameters). Yesterday I copied database db1 from MySQL on srv1 to MySQL on srv2. I can clone the Zabbix server item (https://www.zabbix.com/documentation/2.0/manual/config/items) to srv2, but then I lose all of the srv1 db1 statistics. Can you advise how to keep them?

    Read the article

  • Troubleshooting a slow database server with no load

    - by user1721724
    I'm getting ready to soft launch my website and I've run into some problems with what I think is my MySQL database running on Fedora. All websites run fine, just as I'd expect, but any pages that establish a database connection hang until the connection is established, and then, bang, the site loads as it should. E.g. my landing page (http://www.thrusong.com) doesn't make a database connection and loads quickly. User profile pages (http://www.thrusong.com/john) make a database connection and load slowly, even though most of the data comes from memcached and the database currently has no load on it. This problem just came up yesterday when my router died and I began using my Pace 2Wire modem with built-in router. Before, my old router was set to handle everything. My ISP says the settings in the modem are correct. Any ideas? Thanks in advance.
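
    A hedged guess, since the slowdown started with the router change: MySQL does a reverse DNS lookup on every new connection, and a slow or unreachable DNS server makes the initial connect hang while the queries themselves stay fast. A quick test is to disable the lookups (note that any GRANTs based on hostnames would then need to use IP addresses):

      # /etc/my.cnf
      [mysqld]
      skip-name-resolve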

    Read the article

  • Cron prepending filename to script output

    - by Caitifty
    I'm having an issue with unwanted lines being added to files output by a cron job. I have a script in /etc/cron.hourly which selects some data from a MySQL database and saves it in a text file in /var/www. When I run the script as root, it does exactly what I expect it to do. When the script is executed by cron, it creates the same file, but prepends the following three lines at the top of the output file:

      ::::::::::::::
      /var/www/outputfilename
      ::::::::::::::

    I can't for the life of me work out how to stop this unwanted behavior. The line in /etc/crontab for cron.hourly is the default "44 * * * * root cd / && run-parts --report /etc/cron.hourly". If I use su to become root and do "cd / && run-parts --report /etc/cron.hourly", the script runs as expected and the output doesn't have the mysterious additional 3 lines. I've also tried removing the --report flag from the run-parts command in case that was somehow connected, but no joy. Finally, perusing the cron log output in /var/log/syslog just says cron.hourly ran, without giving any additional information. Any suggestions on solving this weird problem are most welcome.
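
    Not from the original thread, just a hedged observation: that ":::::::::::::" banner is exactly what more(1) prints around each file name when it is handed more than one file and its output is not a terminal, so it is worth checking whether the script pipes through more, or uses a glob that expands to several files under cron's environment. A quick reproduction:

      $ echo a > /tmp/f1; echo b > /tmp/f2
      $ more /tmp/f1 /tmp/f2 > /tmp/out
      $ cat /tmp/out
      ::::::::::::::
      /tmp/f1
      ::::::::::::::
      a
      ::::::::::::::
      /tmp/f2
      ::::::::::::::
      b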

    Read the article

  • Apache Virtual Host Configuration

    - by Carl
    I have been searching the internet for an hour now, and I was hoping for a quick hint here so that I can solve my problem a wee bit faster. My virtual server is so far only accessible through an IP address; no DNS entry yet, and so far none needed either. The problem I have is with Apache2: the virtual hosts are puzzling me. What I need is:

    - Access to my project (based on Symfony2) from the outside via the IP address
    - Access to my project from localhost

    What I have got: access from the outside renders the websites in /var/www/vhosts/htdocs/default, while access from localhost renders the websites in /var/www. Why the difference? What is a recommended configuration for my use case?
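
    A minimal sketch of one way to do it (the IP, paths and Symfony layout are assumptions, not from the question): a single catch-all virtual host pointing at the project's web/ directory answers both the public IP and localhost with the same content.

      <VirtualHost *:80>
          ServerName 203.0.113.10
          ServerAlias localhost
          DocumentRoot /var/www/myproject/web
          <Directory /var/www/myproject/web>
              AllowOverride All
              Order allow,deny
              Allow from all
          </Directory>
      </VirtualHost>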

    Read the article

  • LVM mirroring VS RAID1

    - by syrenity
    Hi. Having learned a bit about LVM mirroring, I thought about replacing the current RAID-1 scheme I'm using to gain some flexibility. The problem is that, according to what I found on the Internet, LVM is:

    1) Slower than RAID-1, at least in reading (as only a single volume is used for reading).
    2) Unreliable across power interruptions, and requires disk cache disabling to prevent data loss.

    http://www.joshbryan.com/blog/2008/01/02/lvm2-mirrors-vs-md-raid-1/

    Also it seems, at least according to several setup guides I read (http://www.tcpdump.com/kb/os/linux/lvm-mirroring/intro.html), that one actually requires a 3rd disk for storing the LVM log. This makes the setup completely unusable on 2-disk installations, and reduces the number of disks available for mirroring in larger installations. Can anyone comment on the above facts, and let me know their experience of using LVM mirroring? Thanks.
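
    One hedged note on the third-disk point, not a full answer: LVM can keep the mirror log in memory instead of on a separate device, at the cost of a full resync after every reboot or crash, and recent LVM2 versions can also mirror the log across the same two disks. Illustrative commands (volume group name and sizes are made up):

      # in-memory log, no third disk needed:
      lvcreate -L 10G -m1 --mirrorlog core -n lv_data vg0 /dev/sda1 /dev/sdb1
      # or keep the log on disk but mirrored alongside the data:
      lvcreate -L 10G -m1 --mirrorlog mirrored -n lv_data vg0 /dev/sda1 /dev/sdb1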

    Read the article

  • mod_rewrite and SEO friendliness

    - by John Doe
    My website has an atypical structure and I'm not sure if this could create problems in the long run, especially for SEO positioning purposes. I have a single, large PHP script, and I use the Apache module mod_rewrite in the .htaccess file to create friendly URLs, for example:

      RewriteRule ^$ /index.php?section=Main
      RewriteRule ^createArticle$ /index.php?section=Main&view=CreateArticle
      RewriteRule ^configuration$ /index.php?section=Configuration
      RewriteRule ^article/([0-9]{1,10})$ /index.php?section=Article&view=Default&id=$1
      RewriteRule ^deleteArticle/([0-9]{1,10})$ /index.php?section=Article&view=Delete&id=$1
      RewriteRule ^reportArticle/([0-9]{1,10})$ /index.php?section=Article&view=Report&id=$1
      RewriteRule ^logIn$ /index.php?section=Authentication
      ...

    So, www.example.com/index.php?section=Article&view=Default&id=105 would become www.example.com/article/105. The only real physical file is index.php, in which the parameters of the queried URL are processed and the corresponding result is output. My question is, do the crawling robots (e.g. Googlebot) recognize these links? Do they index the resulting HTML output by index.php with the specified parameters as if it were an actual HTML file? Also, would this become a problem when creating a Sitemap?
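
    On the Sitemap question, a minimal sketch of what such a file could look like: it only lists the public (rewritten) URLs, so the fact that everything is served by index.php never shows up in it. The URLs below are illustrative.

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url><loc>http://www.example.com/</loc></url>
        <url><loc>http://www.example.com/article/105</loc></url>
      </urlset>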

    Read the article

  • PHP-FPM performing worse than mod_php

    - by lordstyx
    Recently the website I maintain has been growing a lot, and I saw the point coming where I'd want to switch from Apache to nginx, because I kept on reading that it performs way better. Now I've done the switch, and I have to say, nginx is keeping up just fine. However, php-fpm is becoming a problem. Where the PHP pages used to take 0.1 second to generate, with the same load they now take around 3 seconds! Furthermore, the error.log from nginx is being spammed with errors like:

      upstream timed out (110: Connection timed out) while connecting to upstream, client: ...

    I also tried using unix sockets instead, but those would complain about the following:

      connect() to unix:/tmp/php5-fpm.sock failed (11: Resource temporarily unavailable) while connecting to upstream

    I've fiddled with settings here and there but nothing seems to work. Changing the amount of pm.max_children doesn't seem to help a lot either, but with its current value of 350 it seems to be the lesser of all evils. The server that's being used has 3 GB RAM (not all of it is free due to a MySQL server also running) along with 2 dual-core processors (4 cores in total). Am I doing something majorly wrong with the settings here, or is the server simply not capable enough?

    EDIT: Here is the nginx server block

      server {
          listen 80;
          listen [::]:80 default ipv6only=on;

          root /var/www;
          index index.php index.html index.htm;
          server_name localhost;

          location / {
              try_files $uri $uri/ /index.html;
          }

          location /doc/ {
              alias /usr/share/doc/;
              autoindex on;
              allow 127.0.0.1;
              deny all;
          }

          location = /50x.html {
              root /usr/share/nginx/www;
          }

          location ~ \.php$ {
              fastcgi_split_path_info ^(.+\.php)(/.+)$;
              # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini
              try_files $uri = 404;
              # With php5-cgi alone:
              fastcgi_pass 127.0.0.1:9000;
              # With php5-fpm:
              #fastcgi_pass unix:/tmp/php5-fpm.sock;
              fastcgi_index index.php;
              include fastcgi_params;
          }

          location ~ /\.ht {
              deny all;
          }
      }

    And the php-fpm pool:

      [www]
      user = www-data
      group = www-data
      listen = 127.0.0.1:9000
      ;listen = /tmp/php5-fpm.sock
      listen.backlog = -1
      pm = dynamic
      pm.max_children = 350
      pm.start_servers = 200
      pm.min_spare_servers = 10
      pm.max_spare_servers = 350
      pm.max_requests = 1536
      rlimit_files = 65536
      rlimit_core = unlimited
      chdir = /
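
    A hedged sizing sketch rather than a definitive fix: if each PHP-FPM worker uses something in the region of 40 MB (check the real figure with, say, "ps -ylC php5-fpm --sort=rss"), then 350 children would need roughly 14 GB, so on 3 GB shared with MySQL the box would be swapping, which would explain both the 3-second responses and the upstream timeouts. A pool sized to the available memory might look like the following; all numbers are assumptions to be adjusted against measurements.

      [www]
      pm = dynamic
      ; ~60 workers * ~40 MB each is roughly 2.4 GB worst case
      pm.max_children = 60
      pm.start_servers = 20
      pm.min_spare_servers = 10
      pm.max_spare_servers = 30
      pm.max_requests = 500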

    Read the article

  • I want to start my portfolio site using ASP.Net and I'm a bit lost about how to actually put it on t

    - by Papuccino1
    I found this site: www.discountasp.net They seem cheap enough and have a track record, so I decided to host my site with them. Here's where I'm confused. I host the application (my website) with them and they give me an IP address, right? Users can visit my site by typing in that IP address, right? (Of course, once I move the index file and create a default web folder, etc.) The next step is buying a domain name, right? Like www.mysite.com, right? Is this the way it's done, or am I doing it wrong?

    Read the article

  • Best practice, or generally best way to set up web-hosting server, permissions, etc.

    - by Jagot
    Hi, I'm about to set up a server upon which a friend and I will be hosting web sites, and I'll be using Debian. I've set up a LAMP stack many times for local testing purposes, but never for actual production use. I was wondering what the best practices are in terms of setting the server up, specifically with regard to accessing the web root directory. A couple of the options I have seen:

    - Set up a single user account on the server for us both to use, and use a virtual host to point to somewhere in the home directory, e.g. /home/webdev/www.
    - Set each of us up with a user account, and grant permissions in some way to /var/www (What would be the best way? Set up a new group?)

    I want to get this right when I first set this up, as there won't be any going back for a while once our first site is up and running. I appreciate any guidance in advance.
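
    A minimal sketch of the second option, with made-up group and user names: put both users in a shared group, hand /var/www to that group, and set the setgid bit on directories so new files stay group-owned.

      sudo groupadd webdev
      sudo usermod -aG webdev jagot
      sudo usermod -aG webdev friend
      sudo chgrp -R webdev /var/www
      # setgid on directories keeps newly created files in the webdev group
      sudo find /var/www -type d -exec chmod 2775 {} \;
      sudo find /var/www -type f -exec chmod 664 {} \;
      # optionally add "umask 002" to each user's shell profile so new files are group-writable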

    Read the article

  • Existing laravel 4 project gives 404 in browser

    - by Richard A
    I'm trying to set up a development environment on a virtual machine running Ubuntu 14.04 LTS using Nginx and HHVM. To do this, I followed the tutorial here. This goes well with a new installation of Laravel. But when I import an existing Laravel 4 project and try to open that on my actual machine (which will serve as the client, running Windows 7), I'm getting a 404 File Not Found error on the screen while connecting to http://sav.savrichard.dev. I did add this to the hosts file with the correct IP address. The virtual machine is receiving the request and responds with a 404 error. How do I solve this error? I'm pretty new to Ubuntu, so I'm not exactly sure what's wrong. The project is located at /var/www/sav.savrichard.net. The server configuration is as follows:

      server {
          listen 80 default_server;

          root /var/www/sav.savrichard.net/public;
          index index.html index.htm index.php;

          server_name sav.savrichard.dev;

          access_log /var/log/nginx/localhost.sav.savrichard.dev-access.log;
          error_log /var/log/nginx/localhost.sav.savrichard.dev-error.log error;

          charset utf-8;

          location / {
              try_files \$uri \$uri/ /index.php?\$query_string;
          }

          location = /favicon.ico { log_not_found off; access_log off; }
          location = /robots.txt { log_not_found off; access_log off; }

          error_page 404 /index.php;

          include hhvm.conf;

          # Deny .htaccess file access
          location ~ /\.ht {
              deny all;
          }
      }

    And the hhvm.conf file is:

      location ~ \.(hh|php)$ {
          fastcgi_keep_conn on;
          fastcgi_pass 127.0.0.1:9000;
          fastcgi_index index.php;
          fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
          include fastcgi_params;
      }
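
    One hedged thing to check, going only by the configuration as pasted: the backslash-escaped variables in the try_files line (\$uri, \$query_string) sometimes end up in the real file when the block was generated through a shell heredoc, and nginx then treats them as literal text instead of variables. If that is the case in the file on disk, the location block should read:

      location / {
          try_files $uri $uri/ /index.php?$query_string;
      }

      # then verify and reload:
      # sudo nginx -t && sudo service nginx reload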

    Read the article

  • Rsync: execute permission required

    - by user651488
    I'm using rsync between two servers to transfer files. The problem is that some files are not transferred. I get this error:

      rsync: readlink "/var/www/index.html" failed: Permission denied (13)

    So I checked the permissions on the server and, after some tests, I noticed that a file is transferred only if it has these permissions: R-W! If the file has these permissions: R--, rsync can't download it!? The command:

      /usr/bin/rsync -avzr -e "/usr/bin/ssh -i /home/replication/thishost-rsync-key" [email protected]:/var/www/index.html ./

    Is it a bug in rsync? I can't find any information about this problem. Thanks for your help.

    Debian Etch 2.6.30
    Rsync 2.6.9 protocol version 29
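
    Not from the original post, just a hedged workaround if the real cause is that the remote login user cannot read every file under /var/www: let the remote end run rsync through sudo (this needs a NOPASSWD sudoers entry for rsync for that user). The user and host below are placeholders.

      /usr/bin/rsync -avzr -e "/usr/bin/ssh -i /home/replication/thishost-rsync-key" \
          --rsync-path="sudo rsync" \
          replication@remotehost:/var/www/index.html ./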

    Read the article

  • FTP gives me a error when uploading and deleting files [on hold]

    - by AR Games
    Here's the error I get when trying to delete files...

      Command: DELE index.html
      Response: 550 Delete operation failed.

    Here's the error I get when trying to upload files...

      Command: OPTS UTF8 ON
      Response: 200 Always in UTF8 mode.
      Status: Connected
      Status: Starting upload of C:\wamp\www\.DS_Store
      Command: CWD /var/www/html
      Response: 250 Directory successfully changed.
      Command: TYPE A
      Response: 200 Switching to ASCII mode.
      Command: PASV
      Response: 227 Entering Passive Mode (76,185,76,101,78,222).
      Command: STOR .DS_Store
      Response: 553 Could not create file.
      Error: Critical file transfer error
      Status: Retrieving directory listing...
      Command: TYPE I
      Response: 200 Switching to Binary mode.
      Command: PASV
      Response: 227 Entering Passive Mode (76,185,76,101,23,94).
      Command: LIST
      Response: 150 Here comes the directory listing.
      Response: 226 Directory send OK.
      Status: Directory listing successful
      Response: 421 Timeout.
      Error: Connection closed by server
      Status: Disconnected from server

    I'm running Windows and using the FileZilla FTP client.

    Read the article

  • Apache: tmp is not writable

    - by Patrick
    Hi, I've installed Drupal on a new webserver and I get the following errors:

      warning: is_writable() [function.is-writable]: open_basedir restriction in effect. File(/tmp) is not within the allowed path(s): (/customers/rollergirl.ch/rollergirl.ch:/var/www/diagnostics:/usr/share/php) in /customers/rollergirl.ch/rollergirl.ch/httpd.www/drupal/sites/all/modules/imagecache/imagecache.install on line 37.

      ImageCache Temp Directory /tmp is not writeable by the webserver.

    I guess this happens because the server is not configured with a writable tmp folder. I don't have access to the Apache configuration file (I only know for sure that it is Apache). Could you suggest what I should do? Is my only option to contact the hosting service? Thanks
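
    A hedged workaround that needs no Apache access, since open_basedir already allows the site's own directory tree: create a temporary directory inside the web root and point Drupal at it (under the File system settings in the admin area). The directory name below is an assumption:

      mkdir /customers/rollergirl.ch/rollergirl.ch/httpd.www/tmp
      chmod 770 /customers/rollergirl.ch/rollergirl.ch/httpd.www/tmp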

    Read the article

  • How to install PyQt on Mac OS X 10.6

    - by Albert
    I want to install PyQt. This seems kind of complicated to install on OS X. I haven't found any precompiled packages of it (are there any? I would really prefer those). So I downloaded PyQt, and SIP, because it depends on that. These files:

      http://www.riverbankcomputing.co.uk/static/Downloads/PyQt4/PyQt-mac-gpl-4.7.3.tar.gz
      http://www.riverbankcomputing.co.uk/static/Downloads/sip4/sip-4.10.2.tar.gz

    I did a python configure.py && make && sudo make install on SIP -- it installed without any problems. Tried the same on PyQt -- and it failed, of course:

      /Library/Frameworks/QtCore.framework/Headers/qglobal.h:288:2: error: #error "You are building a 64-bit application, but using a 32-bit version of Qt. Check your build configuration."

    OK, so I tried with python configure.py --use-arch=i386. Same error. Any idea?
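
    A hedged sketch of one thing to try (flag names from memory; double-check each configure.py --help): with a 32-bit Qt installed, SIP, PyQt and the Python interpreter itself all have to build and run as i386, not just PyQt.

      cd sip-4.10.2
      arch -i386 python configure.py --arch i386
      make && sudo make install

      cd ../PyQt-mac-gpl-4.7.3
      arch -i386 python configure.py --use-arch i386
      make && sudo make install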

    Read the article

  • MySQL/Apache: Replace spaces with underscores only in certain URLs

    - by javipas
    I'm having a problem with some images I'm using on my WordPress blog. After a migration I renamed every image, replacing spaces with underscores, so

      HIDDEN_264_4062_FOTO_IDF los MID.jpg

    was renamed to

      HIDDEN_264_4062_FOTO_IDF_los_MID.jpg

    But although the trick was necessary and worked for most of the posts, some of them try to find the old image, with spaces. This is not found:

      http://www.example.com/files/HIDDEN_264_4062_FOTO_IDF%20los%20MID.jpg

    and this should be the right URL:

      http://www.example.com/files/HIDDEN_264_4062_FOTO_IDF_los_MID.jpg

    Careful, though, because the "%20" is only shown in the browser: the text in the database shows spaces, not "%20". I'd like to know if maybe I could run a SQL query on my WordPress MySQL database that replaces spaces in .jpg file names with underscores. The path of the images is always the same, so the rule should transform this:

      /files/HIDDEN_264_4062_FOTO_IDF los MID.jpg

    into

      /files/HIDDEN_264_4062_FOTO_IDF_los_MID.jpg

    The "/files/HIDDEN_264_" part is always the same, but the rest varies. Is there some way to perform this? Maybe a rewrite rule in Apache (our current webserver)?
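
    A hedged sketch of the Apache-side approach (untested, pattern is illustrative, placed in an .htaccess at the document root): rewrite one space to an underscore per pass and loop with the [N] flag until none remain. mod_rewrite sees the %20s already decoded back to spaces, so requests for the old names end up at the renamed files without touching the database.

      RewriteEngine On
      RewriteRule "^(files/[^ ]*) (.*\.jpg)$" "$1_$2" [N]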

    Read the article

  • PASS Summit Preconference and Sessions

    - by Davide Mauri
    I’m very pleased to announce that I’ll be delivering a Pre-Conference at PASS Summit 2012. I’ll speak about Business Intelligence again (as I did in 2010) but this time I’ll focus only on Data Warehouse, since it’s big topic even alone. I’ll discuss not only what is a Data Warehouse, how it can be modeled and built, but also how it’s development can be approached using and Agile approach, bringing the experience I gathered in this field. Building the Agile Data Warehouse with SQL Server 2012 http://www.sqlpass.org/summit/2012/Sessions/SessionDetails.aspx?sid=2821 I’m sure you’ll like it, especially if you’re starting to create a BI Solution and you’re wondering what is a Data Warehouse, if it is still useful nowadays that everyone talks about Self-Service BI and In-Memory databases, and what’s the correct path to follow in order to have a successful project up and running. Beside this Preconference, I’ll also deliver a regular session, this time related to database administration, monitoring and tuning: DMVs: Power in Your Hands http://www.sqlpass.org/summit/2012/Sessions/SessionDetails.aspx?sid=3204 Here we’ll dive into the most useful DMVs, so that you’ll see how that can help in everyday management in order to discover, understand and optimze you SQL Server installation, from the server itself to the single query. See you there!!!!!

    Read the article

  • `wget` is not recognized, or else I can't find the downloaded file

    - by clankill3r
    If I use cd C:\Program Files (x86)\GnuWin32\bin then I'm able to use wget commands, for example:

      wget http://www.ultralightnews.com/trikes/images/trikes/dfs-singletrike.jpg

    but I can't find the downloaded file afterwards. I looked in C:\ and in the bin folder mentioned above and in GnuWin32\etc. If I try

      wget -O C:\Users\clankill3r\Downloads\wgetfolder wget http://www.ultralightnews.com/trikes/images/trikes/dfs-singletrike.jpg

    then it says Permission denied, even though I did allow all permissions possible for every group/user. Some people say it downloads to the current folder you're working in (that's why I looked in the bin). But I thought, let's try to run the command from another folder, so I used cd C:\Users\clankill3r\Downloads\wgetfolder and then the wget command, but then it says the wget command is not recognized. Can someone help?
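
    Two hedged observations, with illustrative commands: -O names an output file, not a folder (hence the "Permission denied" when pointing it at a directory), and wget is only found from other folders if its bin directory is on PATH.

      rem make wget callable from any directory for this session
      set PATH=%PATH%;C:\Program Files (x86)\GnuWin32\bin

      rem download into the current folder, keeping the original file name
      cd C:\Users\clankill3r\Downloads\wgetfolder
      wget http://www.ultralightnews.com/trikes/images/trikes/dfs-singletrike.jpg

      rem or name the output file explicitly
      wget -O C:\Users\clankill3r\Downloads\wgetfolder\dfs-singletrike.jpg http://www.ultralightnews.com/trikes/images/trikes/dfs-singletrike.jpg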

    Read the article

  • Sending large files - do any vendors sell their solution?

    - by Rob Nicholson
    We currently have an account with www.mailbigfile.com to allow us to send and receive files which exceed our clients' email limits. In our industry, a 10MB limit is not unknown. Mailbigfile works fine for what it is, but increasingly our clients are starting to block it as a security risk. A solution would be for us to license the software and run it from our own web server, which is far less likely to be blocked. Does anyone know of vendors in this market? We are looking at web collaboration systems, but that's a much bigger project. The technology behind www.mailbigfile.com isn't that complex (HTTP upload, email notification and then HTTP download), so I'm hoping it won't be very expensive. Cheers, Rob.

    Read the article

  • Using a Custom Domain Name In Place of etsy

    - by Ngu Soon Hui
    I am thinking about creating an online shop at Etsy. The one requirement I have is that I want users to see my domain name (www.myname.com) instead of myname.etsy.com. Given that I don't have access to the Etsy server, is there anything I can do with my domain (assuming I am using Apache) so that whatever request comes to www.myname.com will be translated to the Etsy side? This is so that whoever comes to my website won't see the word etsy in the URL. Another particular thing is that I want my custom domain name to show in the web browser location bar when the redirect completes. Is there any way to do this with Apache?

    Read the article

  • Transfer code from one server to another server

    - by Kamlesh Bhure
    I want to transfer new code onto my new production server. I have the code files on my development server. Instead of uploading files over FTP from my local machine, there is another way to transfer code from one server to the other. What I am thinking is that I will make an archive of the whole code to be transferred and place it in the webroot, so that it is accessible on the internet at a link such as http://www.mydomain.com/code.tar.gz. Then, on the other server, I will just run the command:

      wget http://www.mydomain.com/code.tar.gz

    Will this transfer be done in a few seconds? And may I know whether this is a correct approach?
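
    A hedged alternative, assuming SSH access between the two servers: copying directly avoids leaving the source archive publicly downloadable at that URL while it sits in the webroot. The host, user and paths below are placeholders.

      # pack on the development server and copy it straight across
      tar czf /tmp/code.tar.gz -C /path/to/code .
      scp /tmp/code.tar.gz deploy@production:/tmp/

      # or sync the tree directly, transferring only changed files
      rsync -avz /path/to/code/ deploy@production:/var/www/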

    Read the article

  • Where to find a list of online TV/video/webcam sources?

    - by Frank
    I know there are lots of web sites that offer online TV/stream viewing, such as http://tvunetworks.com, http://www.hulu.com/ and more, but the source of their streams is usually well hidden. I wonder if there is any open source project that collects online TV/video/webcam sources so that TV stations and individuals can publicly list their stream sources in the following format (you can copy the URLs below into a browser and start watching):

      Greek TV|mms://eu02.egihosting.com/938657?MSWMExt=.asf
      Turkish TV|http://www.bizidinle.com/player/SAlone.asp?id=7

    Even if there is no public open source project, is there anywhere I can find such a list so that I can get to the stream URLs?

    Read the article
