Search Results

Search found 12803 results on 513 pages for 'lucene index'.

Page 318/513 | < Previous Page | 314 315 316 317 318 319 320 321 322 323 324 325  | Next Page >

  • Web-based interface for OpenSSL client certificates

    - by Felix
    Hi there! We are currently developing an Apache 2-based web application and want to invite some beta testers to give it a try. To be on the safe side, access should be granted via individual browser certificates (.p12) issued by a (fake) CA. Our users will go through a complete register/login process, and some of them will be granted administrative privileges within the application, which is why a simple preceding web-based authentication won't be sufficient. At the moment I am using a server-side shell script to generate each certificate. Do you know of a small, web-based tool to simplify the process of generating and revoking those certificates? Perhaps an overview of the CA's index.txt, plus the option to revoke a certificate and a link to download it directly?
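
    For reference, a minimal shell sketch of the issue/revoke cycle such a tool would wrap; the file names (ca.cnf, client.*) are hypothetical, and the CA config is assumed to already point at the usual index.txt database:

        # Issue: key, CSR, CA-signed cert, then bundle as a browser-importable .p12
        openssl genrsa -out client.key 2048
        openssl req -new -key client.key -out client.csr -subj "/CN=beta-tester"
        openssl ca -config ca.cnf -in client.csr -out client.crt -batch
        openssl pkcs12 -export -inkey client.key -in client.crt \
            -certfile ca.crt -out client.p12

        # Revoke: flag the cert in the CA's index.txt and publish a fresh CRL
        openssl ca -config ca.cnf -revoke client.crt
        openssl ca -config ca.cnf -gencrl -out ca.crl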

    Read the article

  • Can a *.example.com wildcard domain be served by a single page?

    - by Sean Kean
    For a domain 'example.com', what is the easiest way to set up wildcard DNS (*.example.com), hosting, an htaccess/httpd.conf/virtualhost configuration, and a script so that how.do.i.setup.a.site.with.wildcards.like.this.example.com, or anything else given as a subdomain of example.com, is rendered by a page at example.com/index.html, yet keeps the wildcard subdomain in the URL bar and passes the full URL as a parameter for rendering tags in the HTML? An example tag is a Facebook comment: { div class="fb-comments" data-href="http://how.do.i.setup.a.site.with.wildcards.like.this.example.com" data-num-posts="2" data-width="500" } I just opened a hosting account with spry.com and have a VPS running Ubuntu 11.04-x86-LAMP. Essentially, what is the most straightforward way of doing this? Thanks so much. (I originally posted this over on Stack Overflow but realized it's more of a Server Fault question.)
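
    One plausible shape for this, sketched below with a hypothetical server IP: a wildcard A record, plus a single vhost whose ServerAlias catches every subdomain, with the page reading the requested hostname from the Host header (in PHP, $_SERVER['HTTP_HOST']):

        ; DNS zone: send every subdomain to the VPS (IP is hypothetical)
        *.example.com.  IN  A  203.0.113.10

        # Apache: one vhost answers for all of them
        <VirtualHost *:80>
            ServerName example.com
            ServerAlias *.example.com
            DocumentRoot /var/www/example
        </VirtualHost>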

    Read the article

  • How much HDD space would I need to cache the web while respecting robots.txt?

    - by Koning Baard XIV
    I want to experiment with creating a web crawler. I'll start by indexing a few medium-sized websites like Stack Overflow or Smashing Magazine. If it works, I'd like to start crawling the entire web. I'll respect each site's robots.txt. I'll save all HTML, PDF, Word, Excel, PowerPoint, Keynote, etc. documents (not exes, dmgs, etc., just documents) in a MySQL DB. Next to that, I'll have a second table containing all results and descriptions, and a table with words and on which pages to find those words (i.e., an index). How much HDD space do you think I need to save all the pages? Is it as low as 1 TB, or is it about 10 TB, 20? Maybe 30? 1000? Thanks
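
    As a very rough back-of-envelope, with both inputs loudly assumed (on the order of a billion reachable documents, averaging perhaps 100 KB of markup each), the HTML alone lands near the top of that range:

        # Assumed: ~1e9 documents x ~100 KB each; both numbers are rough guesses
        echo "scale=1; 1000000000 * 100 / 1024^3" | bc   # ~93.1 (TB)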

    Read the article

  • phpbb behind a reverse proxy

    - by asciitaxi
    Hi, I've got a Django app running on Apache behind an nginx reverse proxy. Nginx takes requests on port 80 and forwards them to Apache on 127.0.0.1:81. This works fine. Now I want to run phpBB on Apache under /forums. My problem is that when phpBB issues a redirect, it redirects to the internal Apache port rather than port 80. For instance, when I first go to http://my-dev-server/forums to configure phpBB, it immediately redirects to http://127.0.0.1:81/forums/install/index.php. Is there something I need to do in the nginx/Apache/phpBB config to get it to redirect to the external port? Thanks very much!
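
    A minimal nginx sketch of the usual fix, assuming the backend really is 127.0.0.1:81: forward the original Host header so the backend builds URLs against the public name, and rewrite any Location headers that still leak the internal address:

        server {
            listen 80;
            server_name my-dev-server;
            location / {
                proxy_pass http://127.0.0.1:81;
                proxy_set_header Host $host;            # keep the public hostname
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                proxy_redirect http://127.0.0.1:81/ /;  # fix backend-issued redirects
            }
        }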

    Read the article

  • Rewrite rule to redirect all subpages to a single page?

    - by user784637
    I have two files, /etc/apache2/sites-available/foo and /etc/apache2/sites-available/foo_maintenance. The rewrite rule I use in /etc/apache2/sites-available/foo is

        <Directory /var/www/public_html>
            Options +FollowSymlinks
            RewriteOptions inherit
            RewriteEngine on
            # RewriteCond %{HTTP_HOST} ^mysite\.com [NC]
            RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        </Directory>

    so that mysite.com/* redirects to www.mysite.com. After I take my site down for maintenance, if a user navigates to a subpage of the site like mysite.com/subdir/something.php, I would like to redirect them to www.mysite.com so that the index.html of the maintenance page is displayed. What is the rewrite rule to redirect all traffic from any subpage to www.mysite.com?
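
    A sketch of one way to do it in the maintenance vhost, assuming the maintenance page lives at /index.html; the condition prevents a redirect loop, and 302 (rather than 301) keeps browsers from caching the maintenance redirect:

        RewriteEngine on
        RewriteCond %{REQUEST_URI} !^/index\.html$
        RewriteRule ^ http://www.mysite.com/ [R=302,L]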

    Read the article

  • A load-balancing scenario using HAProxy and keepalived shows no performance advantage

    - by chakoshi
    Hi, I am trying to set up a load-balanced web server scenario with two HAProxy load balancers and two Debian web servers, following this guide: http://www.howtoforge.com/setting-up-a-high-availability-load-balancer-with-haproxy-keepalived-on-debian-lenny. The setup is working, but the results of simple performance benchmarking are not what I expected. I used the Apache benchmark tool to send lots of requests to the servers (once directly against one of the web servers, and once through the load balancer) with the command "ab -n 1000000 -c 500 http://IP/index.html", but the test results show better performance for the single server without the load balancer. Can anyone tell me if I'm doing something wrong?
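
    For context, a minimal haproxy.cfg listener in the spirit of that guide (backend IPs are hypothetical); note that with only two backends, a single saturated proxy in front can easily become the bottleneck that ab ends up measuring:

        listen webfarm
            bind 0.0.0.0:80
            mode http
            balance roundrobin
            option forwardfor
            server web1 192.168.0.101:80 check
            server web2 192.168.0.102:80 check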

    Read the article

  • My graphics card isn't working properly

    - by Dan
    I have just upgraded from a Sapphire AMD 6670 2 GB graphics card to an Nvidia GTX 650 Ti 2 GB SSC, and my Windows Experience Index has gone from 6.8 to 7.7, but when playing games I am seeing no improvement. I cannot play Saints Row: The Third on even the lowest settings, yet according to many people and benchmarks on the web I should be able to play it comfortably. I want to know why this is happening to me. I have installed the latest drivers, and I have DirectX 10 and 11 installed. I am 15 and it's my birthday today, as I got the card as a present, but it's not doing what I want. Also, I am using a DVI cable rather than HDMI, because I need a new one and it's in the post. Is it possible that using DVI will affect performance?

    Read the article

  • mod_rewrite RewriteRule is not working

    - by buggy1985
    Hi, this is a follow-up to this question: Rewrite URL - how to get the hostname and the path? And a copy of this: mod_rewrite RewriteRule is not working. I've got this rewrite rule:

        RewriteEngine On
        RewriteRule ^(http://[-A-Za-z0-9+&@#/%=~_|!:,.;]*)/([-A-Za-z0-9+&@#/%=~_|!:,.;]*)\?([A-Za-z0-9+&@#/%=~_|!:,.;]*)$ http://www.xmldomain.com/bla/$2?$3&rtype=xslt&xsl=$1/$2.xsl

    It seems to be correct, and exactly what I need, but it doesn't work on my server; I get a 404 page not found error. mod_rewrite is enabled, as the following simple rule works fine:

        RewriteEngine On
        RewriteRule ^page/([^/\.]+)/?$ index.php?page=$1 [L]

    Can you help? Thanks
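
    One likely reason the first rule never matches: mod_rewrite applies the RewriteRule pattern to the URL path only, never to the scheme, hostname, or query string, so a pattern beginning with ^(http://... cannot match anything. A sketch of the usual workaround, with a hypothetical path prefix, pulling the query string from a RewriteCond whose capture is referenced as %1:

        RewriteEngine On
        # capture the query string; %1 refers to this condition's first group
        RewriteCond %{QUERY_STRING} ^(.+)$
        RewriteRule ^bla/([^/]+)$ http://www.xmldomain.com/bla/$1?%1&rtype=xslt [R,L]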

    Read the article

  • Apache not directing to correct VHost

    - by BANANENMANNFRAU
    I have set up the following virtual host:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerName mysite.com
            ServerAlias www.mysite.com
            DocumentRoot /var/www/homepage/public_html
            ErrorLog ${APACHE_LOG_DIR}/error.log
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    When I hit my URL, Apache still shows the default page, not the index I've created in the given DocumentRoot. In my domain I have set the A record to the IP of my VPS. Output of apache2ctl -S:

        VirtualHost configuration:
        *:80 is a NameVirtualHost
            default server xxxxxx.stratoserver.net (/etc/apache2/sites-enabled/000-default.conf:1)
            port 80 namevhost xxxxxxx.stratoserver.net (/etc/apache2/sites-enabled/000-default.conf:1)
            port 80 namevhost mysite.com (/etc/apache2/sites-enabled/homepage.conf:1)
                alias www.mysite.com
        ServerRoot: "/etc/apache2"
        Main DocumentRoot: "/var/www"
        Main ErrorLog: "/var/log/apache2/error.log"
        Mutex default: dir="/var/lock/apache2" mechanism=fcntl
        Mutex mpm-accept: using_defaults
        Mutex watchdog-callback: using_defaults
        PidFile: "/var/run/apache2/apache2.pid"
        Define: DUMP_VHOSTS
        Define: DUMP_RUN_CFG
        User: name="www-data" id=33 not_used
        Group: name="www-data" id=33 not_used

    How do I need to set up my virtual host so that Apache serves the correct site depending on the domain I'm arriving from?
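
    Given that apache2ctl -S already lists mysite.com as a namevhost, a sketch of the usual checks (commands assume Debian/Ubuntu): requests only land on the named vhost when the browser's Host header actually says mysite.com, so the default server answers anything addressed by bare IP or an unmatched name:

        # confirm which vhost answers for the name vs. the bare IP
        curl -s -o /dev/null -w "%{http_code}\n" -H "Host: mysite.com" http://127.0.0.1/
        # optionally retire the catch-all default site
        sudo a2dissite 000-default
        sudo apache2ctl configtest && sudo service apache2 reload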

    Read the article

  • Newly added virtualhost not working, domain points to /var/www/

    - by Morgan
    I've had no problem with vhosts before, but for some reason this one isn't pointing to the right document root. The domain points to the correct IP, and Apache sees no errors in the config file in sites-available, yet it just isn't pointing correctly. Here is the vhost config for the domain:

        <VirtualHost *80>
            ServerAdmin [email protected]
            ServerName mydomain.info
            ServerAlias www.mydomain.info
            DirectoryIndex index.html
            DocumentRoot /var/www/vhosts/mydomain.info/htdocs
            LogLevel warn
            ErrorLog /var/www/vhosts/mydomain.info/log/error.log
            CustomLog /var/www/vhosts/mydomain.info/log/access.log combined
        </VirtualHost>

    For the record, I am running Apache2 on Ubuntu 12.10.
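
    For comparison, a minimal sketch of the same vhost as Apache documents the syntax; note that the address is written as an address:port pair with a colon (*:80), and a new vhost still needs enabling plus a reload:

        <VirtualHost *:80>
            ServerName mydomain.info
            ServerAlias www.mydomain.info
            DocumentRoot /var/www/vhosts/mydomain.info/htdocs
        </VirtualHost>

        # sudo a2ensite mydomain.info && sudo service apache2 reload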

    Read the article

  • XAMPP: Access Forbidden!

    - by Yar
    I just installed a fresh XAMPP on OS X. Apache runs and I can see the splash page. I opened httpd.conf and set both places that point to htdocs to somewhere else, which results in Apache showing an "Access Forbidden!" message. I plugged my directory in here: <Directory "/Applications/XAMPP/xamppfiles/htdocs"> and here: DocumentRoot "/Applications/XAMPP/xamppfiles/htdocs". Most files have permissions like -rw-r--r--, but even if I set index.php using chmod 777, nothing changes. Strangely, I just did this whole thing with MAMP and had no problem serving that directory, but it was slow.
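
    Two usual suspects, sketched below with a hypothetical target directory: the Apache user needs execute (search) permission on every directory along the path, not just read permission on the files, and the relocated <Directory> block needs its own access grant (Apache 2.2 syntax, matching XAMPP of that era):

        # every parent directory needs +x for Apache to traverse into it
        chmod o+x /Users /Users/yar /Users/yar/mysite

        <Directory "/Users/yar/mysite">
            Options Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>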

    Read the article

  • Problems installing icinga-web

    - by Kungurov
    I'm using Ubuntu 10.04 LTS (64-bit, server) with Apache 2.2.14. Following the instructions from the official Icinga page, http://docs.icinga.org/latest/en/index.html, I installed icinga-web-1.7.1 on my machine and configured a few hosts for test purposes. The Classic interface runs as expected, but the new web interface does not show any data. When I try: ps aux | grep ido2db | grep -v grep I get: icinga 27425 0.0 0.0 41464 600 ? Ss Jul27 0:00 /usr/local/icinga/bin/ido2db -c /usr/local/icinga/etc/ido2db.cfg which might indicate a problem with idomod/ido2db, because according to the docs there should be at least two processes grepped. Any ideas how to fix that?
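
    A hedged guess at the missing half: ido2db is only the receiving daemon, and the idomod broker module also has to be loaded by the Icinga core itself, roughly as in the icinga.cfg sketch below (paths assume the /usr/local/icinga layout from the post):

        # icinga.cfg: hand events to idomod, which feeds them to ido2db
        broker_module=/usr/local/icinga/lib/idomod.so config_file=/usr/local/icinga/etc/idomod.cfg
        event_broker_options=-1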

    Read the article

  • Using wget to recursively download whole FTP directories

    - by user9406
    I want to copy all of the files and folders from one host to another. The files on the old host sit at /var/www/html, and I only have FTP access to that server, so I can't tar all the files. A regular FTP connection to the old host drops me in the /home/admin folder. I tried running the following command from my new server: wget -r ftp://username:[email protected] But all I get is a made-up index.html file. What is the right syntax for using wget recursively over FTP?
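
    A sketch of one form that usually works, with a hypothetical host and credentials; -m (mirror) removes the default recursion depth limit, and the extra slash asks for an absolute server path instead of one relative to the FTP login directory:

        # the doubled slash (equivalently, an encoded %2F) makes the path absolute
        wget -m --no-parent "ftp://user:password@oldhost//var/www/html/"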

    Read the article

  • WGet or cURL: Mirror Site from http://site.com And No Internal Access

    - by alharaka
    I have tried wget -m, wget -r, and a whole bunch of variations. I am getting some of the images on http://site.com, one of the scripts, and none of the CSS, even with the fscking -p parameter. The only HTML page is index.html, and there are several more referenced, so I am at a loss. curlmirror.pl on the cURL developers' website does not seem to get the job done either. Is there something I am missing? I have tried different levels of recursion with only this URL, but I get the feeling I am missing something. Long story short, some school allows its students to submit web projects, but they want to know how they can collect everything for the instructor who will grade it, instead of him going to all the externally hosted sites.
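
    A sketch of a fuller wget invocation for this job; the -H/-D pair (span to the listed external hosts, which are hypothetical here) and -e robots=off are the assumptions, since missing CSS and page requisites are often either externally hosted or blocked by robots.txt:

        wget -e robots=off -m -p -k -E \
             -H -Dsite.com,cdn.example.com \
             http://site.com/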

    Read the article

  • How do I make zeitgeist work in Arch?

    - by wleoncio
    I've been trying to set up Zeitgeist on my GNOME Shell system for a couple of days, but I have yet to get it to work. I've done everything I could think of, i.e. installing zeitgeist from [extra] as well as libqzeitgeist. I've also installed all the GNOME extensions created by Seif (https://extensions.gnome.org/accounts/profile/seif), since they're the reason I'm installing the package in the first place. I've tried running "zeitgeist-daemon --replace" and then "gnome-shell --replace", but nothing seems to work. According to Der Harm's wiki (https://wiki.archlinux.org/index.php/User:Der_harm#Gnome_Zeitgeist), the Zeitgeist daemon doesn't need to be explicitly started, but even if it did, I don't know how to do it (since it's not in /etc/rc.d, I bet adding "zeitgeist" to my rc.conf wouldn't do any good either). I can't believe there isn't a very simple setup here; please help me see what I'm missing!

    Read the article

  • Clean URLs with mod_rewrite and URL-encoded characters cause a 404?

    - by Richard JP Le Guen
    I have a web site using mod_rewrite to get some clean URLs and custom 404 pages. My .htaccess file looks like this:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^(.*)$ index.php?clean_url=$1 [QSA,L]
        </IfModule>

    What puzzles me is that if the URL contains a %2F (URL-encoded /), the server seems to force a 404. As an example, http://example.com/category/article would be a normal article, but http://example.com/category%2farticle gives a server-generated 404 page (not the custom 404 page). I wouldn't have expected this, so why is it happening? Is there a way around it?
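
    For what it's worth, Apache itself rejects encoded slashes in the path by default, before any .htaccess rule runs, which would match the 404 being server-generated rather than the custom page. The directive below relaxes that, but it is only valid in server or virtual-host scope, not in .htaccess:

        # httpd.conf / vhost scope
        AllowEncodedSlashes On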

    Read the article

  • How can I remove OLD history from Google Chrome?

    - by Norman Ramsey
    I'm working on a laptop with a modest hard drive, and 500MB is taken up with Google Chrome "History Index" and "Thumbnails" files. Some of these files are a year old. Chrome offers me the option to remove recent history, but I want the opposite: I want to remove old history. (Ideally I would remove the least recently used history information, but I don't expect to be able to do that.) Anyone have any ideas? I'm running the standard Debian google-chrome-beta package.
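
    A hedged sketch of the manual route, assuming a stock Linux profile location and the dated "History Index YYYY-MM" file naming Chrome used at the time; Chrome rebuilds these caches as needed, so with the browser closed, old ones can simply be deleted:

        # with Chrome closed; filenames are assumptions based on that era's format
        cd ~/.config/google-chrome/Default
        ls -lh "History Index "*
        rm -i "History Index 2009"*   # delete a year's worth at a time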

    Read the article

  • How do I remove a URL from Google without having a Google e-mail account

    - by PP
    Really simple question. I do not want a Google account. I just want Google to stop making requests every 2 minutes for a URL it should never have known about (apparently Google harvests URLs from search requests as well as private e-mails, not just from actual web pages). But when I search Google's help for removing URLs, it appears I have to use their "webmaster tools", which require logging into a Gmail account! How do I tell Google not to index my URL without becoming a customer? Note: I already return 404 for the URLs in question using a rewrite rule; this appears to make zero difference to the crawler, which continually attempts to fetch the page every 2 minutes.
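
    A sketch of the account-free route, with a hypothetical path: a robots.txt disallow stops compliant crawlers from fetching the path at all, and answering 410 Gone rather than 404 is a stronger removal signal:

        # robots.txt at the site root
        User-agent: *
        Disallow: /the-unwanted-url

        # or, in Apache (mod_alias): answer 410 Gone instead of 404
        Redirect gone /the-unwanted-url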

    Read the article

  • Making Python run on my web server

    - by richzilla
    Hi all, I'm getting a bit stuck regarding options for running Python scripts on my server. From the research I've done so far, I can see I need to modify Apache slightly to run Python scripts, using either mod_wsgi or mod_python. Two issues I have: mod_python doesn't appear to be maintained anymore (last release, 2007), and mod_wsgi appears to require modification of my httpd.conf file on a per-application basis. What I'm wanting to know is: is there a way of getting Python scripts to run in the same way as PHP, i.e. just by going to index.py etc., or is it more involved than that? At present I'm just trying to set it up on my XAMPP install. Any help would be appreciated.
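
    The closest analogue to PHP's drop-in model is plain CGI, sketched below; it's slow, but fine for experimenting on a XAMPP install (mod_wsgi remains the better long-term route). The script must be executable and emit a header before any output:

        # httpd.conf or .htaccess: treat .py files as CGI scripts
        Options +ExecCGI
        AddHandler cgi-script .py

        #!/usr/bin/env python
        # index.py (Python 2, matching the era): header, blank line, then body
        print "Content-Type: text/html"
        print
        print "<h1>Hello from Python</h1>"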

    Read the article

  • I want to start my portfolio site using ASP.NET and I'm a bit lost about how to actually put it on the web

    - by Papuccino1
    I found this site: www.discountasp.net. They seem cheap enough and have a track record, so I decided to host my site with them. Here's where I'm confused: I host the application (my website) with them and they give me an IP address, right? Users can then visit my site by typing in that IP address, right? (Once I move the index file and create a default web folder, etc.) The next step is buying a domain name, right? Like www.mysite.com? Is this the way it's done, or am I doing it wrong?
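
    Roughly, yes; the missing piece is DNS. Once the domain is bought, records at the registrar map the name to the host, along the lines of the hypothetical zone sketch below (shared hosts often route requests by hostname, so the name rather than the raw IP is usually what matters):

        ; hypothetical zone file entries; the IP comes from the hosting account
        mysite.com.      IN A      203.0.113.10
        www.mysite.com.  IN CNAME  mysite.com.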

    Read the article

  • Warning in Apache log: Cannot get media type from 'x-mapp-php5'

    - by IronGoofy
    I have no idea what is causing this issue, but it seems to be related to the displayed file (just a simple index.php that prints phpinfo) being in an aliased directory. Any suggestions for how I can avoid the warning? Here's an excerpt from my httpd.conf:

        <Directory "<dir with broken php>">
            Options Indexes FollowSymLinks ExecCGI Includes
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

        Alias /smartersoftware/ "<broken dir>"

        <FilesMatch \.php$>
            SetHandler application/x-httpd-php
        </FilesMatch>

    The last three lines were required to make PHP work at all (which I found a bit strange, and it may or may not be related to my problem). Adding AddType application/x-mapp-php5 .php didn't change anything.
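
    A hedged guess: x-mapp-php5 is the PHP handler name used by 1&1-style shared-hosting configs, so the warning usually means a stray AddType/AddHandler for that type is being inherited from somewhere. A sketch of one way to clear it before mapping PHP properly (RemoveType/RemoveHandler are mod_mime directives; the rest mirrors the excerpt above):

        RemoveType .php
        RemoveHandler .php
        <FilesMatch \.php$>
            SetHandler application/x-httpd-php
        </FilesMatch>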

    Read the article

  • How to set auto-redirection in Tomcat

    - by Registered User
    I have a site, http://social.openitup.in; right now what you are seeing is a default Tomcat 6 page. I am using mod_ajp as a front end, and the Apache vhost configuration for it is:

        <VirtualHost *:80>
            ServerName social.openitup.in
            ServerAdmin webmaster@localhost
            ProxyRequests off
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
            ProxyPreserveHost On
            ProxyPass / ajp://192.168.1.19:8009/
            ProxyPassReverse / ajp://192.168.1.19:8009/
        </VirtualHost>

    However, I have an application running on it, http://social.openitup.in/olat. What I want is that when someone opens http://social.openitup.in, rather than seeing the Tomcat 6 home page from /var/lib/tomcat6/webapps/ROOT/index.html, the person is redirected to the OLAT application in /var/lib/tomcat6/webapps/olat. How can this be achieved? The above vhost configuration is on a separate machine from the one where OLAT is running.
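
    One minimal sketch, done on the Apache side since that is where the vhost lives: proxy only the application path and bounce the bare root to it with mod_alias, instead of proxying everything:

        # proxy the app itself, and redirect the bare hostname to it
        ProxyPass /olat ajp://192.168.1.19:8009/olat
        ProxyPassReverse /olat ajp://192.168.1.19:8009/olat
        RedirectMatch ^/$ /olat/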

    Read the article

  • Odd Suhosin memory alerts

    - by slice
    I am getting a lot of odd Suhosin alerts in my syslog. The following are example entries:

        Jun 9 08:46:11 suhosin[9764]: ALERT - script tried to increase memory_limit to 2145386496 bytes which is above the allowed value (attacker '157.55.39.180', file '/var/www/site/index.php')
        Jun 9 08:46:11 suhosin[9744]: ALERT - script tried to increase memory_limit to 2145386496 bytes which is above the allowed value (attacker '109.74.2.136', file '/var/www/site/test.php')
        Jun 9 08:46:13 suhosin[9779]: ALERT - script tried to increase memory_limit to 0 bytes which is above the allowed value (attacker 'REMOTE_ADDR not set', file 'unknown')
        Jun 9 08:46:13 suhosin[9779]: ALERT - script tried to increase memory_limit to 2145386496 bytes which is above the allowed value (attacker 'REMOTE_ADDR not set', file 'unknown')

    What is happening here? Why 0 bytes, or 2145386496 bytes (2046 MB, i.e. about 2 GB)? Why does it sometimes state the attacker and the requested script, and sometimes 'REMOTE_ADDR not set' and file 'unknown'? How do I proceed to figure this out?
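
    A sketch of how one might start digging, under the assumption that the alerts come from scripts calling ini_set('memory_limit', ...); the 'REMOTE_ADDR not set' entries typically correspond to CLI or cron invocations, which have no client address:

        # find every script that tries to raise its own memory limit
        grep -rn "ini_set.*memory_limit" /var/www/site/
        # and check what suhosin currently allows
        php -i | grep suhosin.memory_limit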

    Read the article

  • How to report a malicious site to Google, Microsoft, Mozilla, etc. so that they will warn users

    - by Jayapal Chandran
    I completed a project a year ago, and now a few modifications were needed. While trying to test the site, I found an index.html file with a malicious script: an iframe pointing to another site's jar file. Kaspersky antivirus blocked it. I browsed via FTP to find the file, deleted it, and also disabled directory listing. Perhaps the site owner's FTP details were stolen. I want to report this site to Google, Microsoft, Mozilla, and other antivirus providers. How do I do that? I expect Kaspersky has already added it to their database, but I still want to report it explicitly. Kaspersky flagged the page with a popup warning.

    Read the article

  • Nvidia 9 series or Intel HD 2000? [closed]

    - by EApubs
    I just tested an Nvidia 9300 GS card against an Intel Core i3's HD 2000 integrated graphics. Here are the Windows Experience Index scores I got:

        Nvidia 9300 GS - Base Score 3.9 (Processor 7.1, Memory 7.5, Graphics 3.9, Gaming Graphics 5.1, Hard Disk 5.9)
        Intel HD 2000  - Base Score 5.2 (Processor 7.1, Memory 5.9, Graphics 5.2, Gaming Graphics 5.8, Hard Disk 5.9)

    My questions are: When using Intel HD graphics, why does it reduce the score of my RAM? It checks the speed of the RAM, not the size (I think). Integrated graphics take some of the RAM's space, but how can that affect its speed? And of the two, which is the better choice?

    Read the article
