Search Results


  • SQL Bits 7 - 30th September - 2nd October 2010 in York

    In case you haven't heard, we are planning the next SQLBits event, and today we have released the agenda for Friday and Saturday: a total of 50 sessions covering all aspects of SQL Server, with a great selection of speakers. http://www.sqlbits.com/information/Agenda.aspx

    From our recent announcement: SQLBits 7 will take place over three days, from Thursday September 30th to Saturday October 2nd, in York. Day one will be a training day, featuring in-depth full-day seminars by leading SQL Server professionals such as Chris Testa-O’Neill and Chris Webb (see http://www.sqlbits.com/information/TrainingDay.aspx for more details). Day two will be a deep-dive conference day with advanced sessions delivered by the best speakers from the SQL Server community. Day three will be the traditional SQLBits community conference day, with a wide range of sessions covering all aspects of SQL Server at all levels of ability. There will be a charge to attend days one and two, but day three, Saturday October 2nd, will as usual be completely free, allowing everyone to experience a great day of training even if they have no training budget. Full details are available at http://www.sqlbits.com.

    Read the article

  • SSL certificates and whether a wildcard common name will support domain.com

    - by timpone
    Sorry if this is very vendor specific, but I purchased an inexpensive SSL cert from GoDaddy. Right now everything on production is hosted off of www.domain.com. When specifying the common name, would a wildcard (i.e. *.domain.com) cover the case where there is no third-level domain, i.e. plain domain.com? Just to be sure, I made the cert for www.domain.com rather than a wildcard. If it matters, I will be using it with nginx and mod_passenger. If I want to cover everything - domain.com, staging.domain.com, www.domain.com, etc. - would a wildcard be the proper cert? Does the inexpensive GoDaddy cert (12.99 / year) cover wildcards (it didn't seem to for me)? Again, sorry for asking vendor-specific questions, and thanks in advance.
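
    One way to see exactly which names a certificate covers - i.e. whether the bare domain is listed alongside (or instead of) a wildcard - is to inspect its Subject Alternative Name entries. A minimal sketch, assuming the cert is already installed on www.domain.com and openssl is available locally:

        echo | openssl s_client -connect www.domain.com:443 -servername www.domain.com 2>/dev/null \
          | openssl x509 -noout -subject -text | grep -A1 'Subject Alternative Name'

    A wildcard such as *.domain.com matches a single label (www, staging, ...) but not the bare domain.com itself; whether the bare domain works in practice depends on whether the CA also lists it as an extra SAN, which the output above shows directly.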

    Read the article

  • apache domain names are case sensitive

    - by neubert
    The following HTTP request results in a "See the error log for more details; Invalid Value Found For Domain" error:

        GET / HTTP/1.0
        Host: www.MyWebsite.com

    If I make the hostname all lowercase, however, it works just fine. How can I make Apache case insensitive? Here's my httpd.conf file:

        <VirtualHost *:80>
            ServerName mywebsite.com
            ServerAlias www.mywebsite.com
            ...
        </VirtualHost>

    I tried adding ServerAlias www.MyWebsite.com to that but that didn't help. And in any event, it seems like that's a poor approach anyway since the case can be mixed up in a ton of different ways and trying to account for all of them would result in a huge *.conf file. Any ideas? Thanks!
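
    For what it's worth, Apache matches ServerName/ServerAlias against the Host header case-insensitively, so the "Invalid Value Found For Domain" message most likely comes from the application behind it rather than from the vhost matching. A common workaround either way is to normalise mixed-case hostnames with a single 301 redirect via an internal tolower RewriteMap. A minimal sketch, placed in the VirtualHost (RewriteMap is not allowed in .htaccess):

        RewriteEngine On
        RewriteMap  lc int:tolower
        # Redirect any host that contains an uppercase letter to its lowercase form
        RewriteCond %{HTTP_HOST} [A-Z]
        RewriteRule ^(.*)$ http://${lc:%{HTTP_HOST}}$1 [R=301,L]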

    Read the article

  • Only allow the POST method for a specific file in a directory

    - by Dave Chen
    I have one file that should only be accessible via the POST method:

        /var/www/folder/index.php

    The document root is /var/www/ and index.php is nested inside a folder. My configuration is as follows:

        <Directory "/var/www/folder">
            <Files "index.php">
                Order deny,allow
                Allow from all
                <LimitExcept POST>
                    Deny from all
                </LimitExcept>
            </Files>
        </Directory>

    I visit my server at 127.0.0.1/folder but I can GET and POST the file just like normal. I've also tried reversing the order (Order allow,deny), as well as Require, LimitExcept and Limit. How can I only allow POST requests to be processed by one file in a folder?
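
    Not a definitive fix, but if the server is Apache 2.4 (mod_authz_core), method restrictions can be expressed directly with Require method, which sidesteps the Order/Allow/Deny merging rules that make the 2.2-style block above easy to get wrong. A minimal sketch, assuming Apache 2.4:

        <Directory "/var/www/folder">
            <Files "index.php">
                # Access is granted only when the request method is POST;
                # everything else (GET, HEAD, ...) receives 403.
                Require method POST
            </Files>
        </Directory>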

    Read the article

  • Need help with an .htaccess URL redirector

    - by AlexV
    I'm trying to do another SEO system with PHP/.htaccess... I need the following rules to apply:

    1. Must catch all URLs that do not end with an extension (www.foo.com -- catch | www.foo.com/catch-me -- catch | www.foo.com/dont-catch.me -- don't catch).
    2. Must catch all URLs that end with .php* (.php, .php4...); these are the exceptions to rule #1.
    3. All rules must only apply in some directories and not in their subdirectories (/ and /framework so far).
    4. The .htaccess must send the typed URL in a GET value so I can work with it in PHP.

    Can any mod_rewrite wizard help me?
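
    Not an authoritative answer, but here is a minimal .htaccess sketch of the kind of rule set described above. The front controller name (index.php) and the GET parameter (url) are illustrative assumptions, not taken from the question:

        Options +FollowSymlinks
        RewriteEngine On

        # Never rewrite the front controller itself (avoids rewrite loops).
        RewriteRule ^index\.php$ - [L]

        # Rule 2: URLs ending in .php, .php4, ... are routed to the controller.
        RewriteRule ^(.+\.php[0-9]*)$ index.php?url=$1 [NC,L,QSA]

        # Rule 1: URLs whose last path segment has no extension at all.
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !\.[^/]+$
        RewriteRule ^(.*)$ index.php?url=$1 [L,QSA]

    The per-directory restriction in rule 3 is not expressed in the rules themselves; in practice it is usually handled by which directories contain (or inherit) the .htaccess file.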

    Read the article

  • Excel - working in a bank

    - by Einsteins Grandson
    I am supposed to go to an interview at a bank for a part-time job just supporting managers in projects, and the thing is that the bank uses Excel for everything: modifying tables with a really large amount of data, and so on. What can I expect to find in the Excel test? I have some books that are around 1000 pages thick, but I don't have the time and also don't feel like reading everything that's in them. These are the books that I have:

    http://www.amazon.com/Excel-2010-Bible-John-Walkenbach/dp/0470474874/ref=sr_1_1?ie=UTF8&qid=1347571864&sr=8-1&keywords=excel+bible
    http://www.amazon.com/Excel-2010-The-Missing-Manual/dp/1449382355/ref=sr_1_1?ie=UTF8&qid=1347571884&sr=8-1&keywords=Excel+2010+The+Missing+Manual
    http://www.amazon.com/Microsoft-Excel-2010-In-Depth/dp/0789743086/ref=sr_1_1?ie=UTF8&qid=1347571904&sr=8-1&keywords=Microsoft+Excel+2010+In+Depth

    So, does anybody know a good online tutorial or a book that covers the basics and is not quite so thick? ;-) Thanks so much!!!

    Read the article

  • Host forwarding fails, server is up, domain name tests ambiguous

    - by jayunit100
    I have a domain name registered with http://www.registryrocket.com/. The "main" site, rudolfcode.net, is registered under GoDaddy and forwards to a Heroku site (rudolfcode.herokuapp.com). I have found that the main site, rudolfcode.net, works, but the HostGator forwarding has stopped working (Firefox simply fails when you point it to http://www.rudolflabs.com, which is the domain name registered by HostGator). How can I debug this issue? Finally, I have tried to run some DNS tests, and here are the results: I'm not sure what the failures mean, but I'm pretty sure that "Conecting to WWW Home Page" failed is a pretty bad sign! Thanks.
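
    One way to narrow down whether this is a DNS problem or a forwarding problem is to query the records for each hostname directly. A minimal sketch using dig and curl (the hostnames are the ones from the question):

        # What does the forwarded hostname actually resolve to?
        dig +short www.rudolflabs.com CNAME
        dig +short www.rudolflabs.com A

        # Compare with the site that is known to work
        dig +short rudolfcode.net A
        dig +short rudolfcode.herokuapp.com CNAME

        # If DNS resolves, see what the web server (or forwarder) answers
        curl -I http://www.rudolflabs.com/

    If the first queries return nothing, the problem is on the DNS/registrar side; if they resolve but curl hangs or errors, the forwarding service itself is the more likely culprit.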

    Read the article

  • Jet Brains release WebStorm 5.0

    - by TATWORTH
    At http://www.jetbrains.com/webstorm/whatsnew/index.html?WS50ROW, JetBrains have announced the release of WebStorm 5.0, an IDE that brings the ease of code writing that ReSharper gives you in VB.NET and C# to JavaScript, CSS and LESS. (There are some more details at http://blog.jetbrains.com/webide/2012/08/liveedit-plugin-features-in-detail/.) Code completion in JavaScript, CSS and LESS is a very welcome feature, and I look forward to trying out WebStorm.

    The download at http://www.jetbrains.com/webstorm/download/index.html comes with a free 30-day trial. Price information is at http://www.jetbrains.com/webstorm/buy/index.jsp - you should note that if you are an open-source developer, you can apply for a free license. The price of a personal license at £23 + VAT is a no-brainer, and the price of a commercial license would be paid for in a few days of the increased productivity that this tool brings.

    WebStorm currently requires Google Chrome to run. Like ReSharper, it appears to be a very able tool. It includes tools such as:

    - XSLT debugging
    - JSLint for checking for JavaScript errors
    - JavaScript debugging
    - JavaScript unit testing (including code coverage)
    - JavaScript folding regions
    - CoffeeScript support

    I suggest that you try WebStorm 5.0.

    Read the article

  • Best way to use mod_rewrite to replace WordPress pages with static files

    - by David Moles
    Here's the situation: I've got an old WordPress installation that I'd like to archive as static files, but I'd also like to preserve old URLs. I've already created the static archive with wget and sorted out the filenames and links. Now I'd like to configure Apache to intercept requests for the old dynamic URLs and replace them with the new static ones, e.g.:

        http://www.example.org/log/?p=1234
        http://www.example.org/log/index.php?p=1234

    should redirect to

        http://www.example.org/log/archives/1234.html

    I've tried adding the following to the VirtualHost config for example.org, but to no effect -- I just get the PHP page:

        RewriteCond %{REQUEST_URI} /log/
        RewriteCond %{QUERY_STRING} p=([^&;]*)
        RewriteRule ^/$ http://%{SERVER_NAME}/log/archives/%1.html [R,L]

    I've enabled logging and I can see what look like other rules being applied, but not this one. None of my other guesses at match patterns for %{REQUEST_URI} seem to have any effect either (log, log/, log.*, even .*). I'm new to mod_rewrite and this is mostly cargo cult, so I'm pretty sure I've gotten it wrong. Anyone know what I should be doing here?
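
    Not a definitive diagnosis, but one detail stands out: in VirtualHost context the RewriteRule pattern is matched against the full URL path (here /log/ or /log/index.php), so a pattern of ^/$ never matches these requests. A sketch of the same redirect with the path moved into the rule and a trailing ? to drop the old query string (assuming mod_rewrite is already enabled in that vhost):

        RewriteEngine On
        RewriteCond %{QUERY_STRING} (?:^|&)p=([0-9]+)
        RewriteRule ^/log/(index\.php)?$ http://%{SERVER_NAME}/log/archives/%1.html? [R=301,L]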

    Read the article

  • Apache Default/Catch-All Virtual Host?

    - by SJaguar13
    If I have 3 domains - domain1.com, domain2.com, and domain3.com - is it possible to set up a default virtual host for domains not listed? For example, I would have:

        <VirtualHost 192.168.1.2 204.255.176.199>
            DocumentRoot /www/docs/domain1
            ServerName domain1
            ServerAlias host
        </VirtualHost>

        <VirtualHost 192.168.1.2 204.255.176.199>
            DocumentRoot /www/docs/domain2
            ServerName domain2
            ServerAlias host
        </VirtualHost>

        <VirtualHost 192.168.1.2 204.255.176.199>
            DocumentRoot /www/docs/everythingelse
            ServerName *
            ServerAlias host
        </VirtualHost>

    If you register a domain and point it to my server, it would default to everythingelse, showing the same as domain3. Is that possible?
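
    For what it's worth, Apache falls back to the first name-based VirtualHost defined for an address when no ServerName/ServerAlias matches the Host header, so simply listing the catch-all vhost first is one option. Another is to keep it last but make the fallback explicit with a wildcard alias; a minimal sketch (the ServerName value is just a placeholder, since wildcards belong in ServerAlias):

        <VirtualHost 192.168.1.2 204.255.176.199>
            DocumentRoot /www/docs/everythingelse
            ServerName default.example.invalid
            ServerAlias *
        </VirtualHost>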

    Read the article

  • Lighttpd referer issue

    - by Chris
    I have a problem blocking files from being accessed from domains other than my own. I have added the following to my lighty config in the "virtual host":

        $HTTP["referer"] !~ "^($|http://www\.my-site\.net)" {
            url.access-deny = ( "" )
        }

    But the site www.example.com can still access http://player.my-site.net/player.swf, and it can also be accessed directly without a referrer. Any idea?

    //EDIT: here is my old Apache .htaccess with a rewrite rule that works perfectly, but I don't know how to convert it for lighty:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_REFERER} !^http://my-site\.net/ [NC]
        RewriteCond %{HTTP_REFERER} !^http://www\.my-site\.net/ [NC]
        RewriteCond %{HTTP_REFERER} !^http://player\.my-site\.net/ [NC]
        RewriteCond %{HTTP_REFERER} !^http://stream\.my-site\.net/ [NC]
        RewriteRule .* - [L,R=404]
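
    Untested against this setup, but a sketch of how the working .htaccess logic might be expressed in lighttpd configuration, scoped to the player host. The regex accepts the bare domain or any my-site.net subdomain as referer and denies everything else - including an empty referer, to mirror the Apache rules:

        $HTTP["host"] == "player.my-site.net" {
            $HTTP["referer"] !~ "^(?i)http://([^/]+\.)?my-site\.net/" {
                url.access-deny = ( "" )
            }
        }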

    Read the article

  • HTTPS and HTTP issue on server with SSL

    - by Asghar
    I have a site www.example.com for which I purchased an SSL cert and installed it, and it was working fine. I also have a subdomain, app.example.com, which was not on SSL. Both www.example.com and app.example.com are on the same IP address. Later we decided to put SSL only on app.frostbox.com, and then I configured SSL with app.frostbox.com and it worked fine. Now the issue is that Google is indexing my site as https://www.example.com/, and when users hit the site an invalid-security warning is issued; when users click through the warning they are shown my app.example.com contents. Note: I have my SSL configuration in /etc/httpd/conf.d/ssl.conf; its contents are here: http://pastebin.com/GCWhpQJq. NOTE: I tried solutions in .htaccess, like 301 redirects etc., but none of those worked.
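
    One hedged option, assuming Apache with SNI support: give www.example.com its own :443 VirtualHost so HTTPS requests for it no longer fall through to the app vhost, and send visitors back to plain HTTP. The certificate and key paths below are illustrative, not taken from the question, and browsers that reach this vhost will still warn first, because the only certificate on the IP covers the app hostname:

        <VirtualHost *:443>
            ServerName www.example.com
            SSLEngine on
            SSLCertificateFile    /etc/pki/tls/certs/app.example.com.crt
            SSLCertificateKeyFile /etc/pki/tls/private/app.example.com.key
            # Anything that arrives here over HTTPS goes back to the HTTP site
            Redirect permanent / http://www.example.com/
        </VirtualHost>

    Making https://www.example.com/ warning-free (and getting it out of Google's index) would likely also involve either a certificate that covers www.example.com or a redirect like the one above combined with updated sitemap/canonical URLs.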

    Read the article

  • Multilingual sites and Google search results, do subfolders really work?

    - by AWinter
    About three months ago we added an English version of our previously Japanese-only site http://www.clubberia.com under the subfolder http://www.clubberia.com/en/. We've tried to follow the (sometimes incomplete) best practices laid out by Google by adding alternate tags to all pages that are currently translated. The top page, for instance, has the following meta tags for language:

        <link rel="canonical" href="/">
        <link rel="alternate" hreflang="ja" href="/">
        <link rel="alternate" hreflang="en" href="/en/">

    while the English main page under /en/ has:

        <link rel="canonical" href="/en/">
        <link rel="alternate" hreflang="ja" href="/">
        <link rel="alternate" hreflang="en" href="/en/">

    We also have these alternate languages set up in the sitemap (as per Google's recommendations): http://www.clubberia.com/sitemap.xml

    It seems, however, that Google absolutely refuses to show the English top page in results when the user is using English at google.com. If you search for "clubberia" you'll, as of this post, get the Japanese description and a title that Google has apparently invented instead of the title and description in the meta tags for the /en/ index page. Does anyone have any experience with subfolders actually working to affect search results? Am I being too impatient, or possibly doing something incorrect? Should we just give up on subfolders and push to subdomains (not the prettiest option)?
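
    One thing worth checking, offered as a hedge rather than a diagnosis: Google's hreflang and canonical examples use fully qualified absolute URLs, and relative href values like the ones above are a common reason the annotations get ignored. A sketch of the same tags for the /en/ page with absolute URLs:

        <link rel="canonical" href="http://www.clubberia.com/en/">
        <link rel="alternate" hreflang="ja" href="http://www.clubberia.com/">
        <link rel="alternate" hreflang="en" href="http://www.clubberia.com/en/">

    The Japanese top page would carry the same pair of alternate tags, with its canonical pointing at http://www.clubberia.com/.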

    Read the article

  • Cat all files in a directory, with a specific file at the beginning an end...?

    - by Aeisor
    Is there a way to cat all files in a given directory, but with a particular file at the beginning and end? For example, say I have file1.js, file2.js, file3.js, file4.js and file5.js. Effectively I would like to:

        cat file2.js file*.js file3.js > /var/www/output.js

    I've tried a few variations of these:

        find ! -name "file2.js" ! -name "file3.js" -type f -exec cat file2.js {} file3.js > /var/www/js/output.js \;
        find ! -name "file2.js" ! -name "file3.js" -type f | xargs -I files cat file2.js files file3.js > /var/www/output.js

    but the best I can get out of it is file2.js added before and file3.js added after all other files (multiple times). I know I could specify the files in the order I wanted manually, but this is not maintainable (I'm expecting potentially 100 files). I have looked through man cat, as well as a handful of websites devoted to xargs, find and cat, to no avail. Thanks in advance.
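
    A minimal sketch of one way to do this with a brace group, assuming GNU find/sort/xargs and that the command is run from the directory holding the .js files (output.js is excluded in case it ends up in the same directory):

        {
            cat file2.js
            find . -maxdepth 1 -name '*.js' \
                ! -name 'file2.js' ! -name 'file3.js' ! -name 'output.js' -print0 \
                | sort -z | xargs -0 cat
            cat file3.js
        } > /var/www/output.js

    The group runs the three cat steps in order and redirects their combined output once, which keeps the "first file, everything else, last file" ordering without listing the middle files by hand.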

    Read the article

  • What kind of proxy acl rules should be applied?

    - by user42891
    I am trying to block sites in Squid based on this article. Assuming you wanted to block access to Yahoo (e.g. http://www.yahoo.co.jp, http://www.yahoo.com, http://www.yahoo.co.in), you would ideally want to block all of the above URLs; if I use a regular expression and search for something called "yahoo", it seems to get blocked. We are just interested in applying rules that would be most commonly used across all companies: social networking sites (e.g. Facebook, Orkut), porn sites (e.g. sex), gaming sites (games), movie and song download sites, and sites where they can upload data (e.g. RapidShare). What would be a common set of effective rules for achieving the above?
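
    A sketch of the usual pattern, with purely illustrative lists: dstdomain ACLs for known sites and a (much blunter) url_regex ACL for keywords. Domain ACLs are generally preferred, because a keyword regex such as "games" will also block unrelated URLs that merely contain that string:

        acl social_sites  dstdomain .facebook.com .orkut.com
        acl blocked_sites dstdomain .yahoo.com .yahoo.co.jp .yahoo.co.in
        acl upload_sites  dstdomain .rapidshare.com
        acl bad_keywords  url_regex -i sex torrent

        http_access deny social_sites
        http_access deny blocked_sites
        http_access deny upload_sites
        http_access deny bad_keywords
        # ...followed by the usual allow rules, e.g.:
        # http_access allow localnet
        # http_access deny all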

    Read the article

  • How to set permalink of your blog post according to date and title of post?

    - by Amit
    I have the website http://www.finalyearondesk.com. My blog post links are set like this: http://www.finalyearondesk.com/index.php?id=28. I want them to be set like this: finalyearondesk.com/2011/09/22/how-to-recover-ubuntu-after-it-is-crashed/. I am using the following function to get these posts:

        function get_content($id = '') {
            if($id != ""):
                $id = mysql_real_escape_string($id);
                $sql = "SELECT * from blog WHERE id = '$id'";
                $return = '<p><a href="http://www.finalyearondesk.com/">Go back to Home page</a></p>';
                echo $return;
            else:
                $sql = "select * from blog ORDER BY id DESC";
            endif;
            $res = mysql_query($sql) or die(mysql_error());
            if(mysql_num_rows($res) != 0):
                while($row = mysql_fetch_assoc($res)) {
                    echo '<h1><a href="index.php?id=' . $row['id'] . '">' . $row['title'] . '</a></h1>';
                    echo '<p>' . "By: " . '<font color="orange">' . $row['author'] . '</font>' . ", Posted on: " . $row['date'] . '<p>';
                    echo '<p>' . $row['body'] . '</p><br />';
                }
            else:
                echo '<p>We are really very sorry, this page does not exist!</p>';
            endif;
        }

    Any suggestions how to do this? And can we do this by using .htaccess?
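
    A sketch of how this is commonly done, offered as an illustration rather than a drop-in fix. An .htaccess rule maps the date/slug URL onto index.php, and the PHP side builds the pretty link from the post's date and title; the parameter names (y, m, d, slug) and the idea of looking posts up by a slug are assumptions, not part of the existing schema:

        # .htaccess: /2011/09/22/how-to-recover-ubuntu-after-it-is-crashed/ -> index.php
        RewriteEngine On
        RewriteRule ^([0-9]{4})/([0-9]{2})/([0-9]{2})/([a-z0-9-]+)/?$ index.php?y=$1&m=$2&d=$3&slug=$4 [L,QSA]

        <?php
        // PHP: build the permalink when rendering the post list
        $slug = trim(preg_replace('/[^a-z0-9]+/', '-', strtolower($row['title'])), '-');
        $link = 'http://www.finalyearondesk.com/' . date('Y/m/d', strtotime($row['date'])) . '/' . $slug . '/';
        echo '<h1><a href="' . $link . '">' . $row['title'] . '</a></h1>';
        ?>

    get_content() would then need to resolve the incoming slug (or date plus slug) back to a post, for example via a slug column in the blog table.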

    Read the article

  • Redirecting Subdomain with Volusion and GoDaddy

    - by ToddN
    I need to create a subdomain (reviews.basequipment.com) and have it redirect to our reviews page at http://www.resellerratings.com/store/survey/Burkett_Restaurant_Equipment_Supplies. The geniuses at Volusion have never gotten this to work correctly, and I have been asking them for years now (yes, years). Our nameservers are on Volusion and we are hosted through GoDaddy. Volusion tells me to use their "CPanel" to update the DNS records to this:

        reviews | CNAME | www.resellerratings.com

    This doesn't even work when going to reviews.basequipment.com, and of course it does not do the full job, as I want to go to http://www.resellerratings.com/store/survey/Burkett_Restaurant_Equipment_Supplies. So I have then tried doing a forwarded subdomain in GoDaddy: added the subdomain "reviews" and forwarded it to the link I mentioned above, to no avail (I'm assuming because my nameservers reside on Volusion). Has anyone had experience doing this, or does anyone have any suggestions? Volusion is pretty limited with this and, as I've said, I have been trying this for years, so I've finally decided to consult the community here at stackflow. Thanks in advance.
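
    For what it's worth, a CNAME can only map one hostname to another hostname; it cannot carry a URL path, so pointing reviews at www.resellerratings.com just makes their server answer (or refuse to answer) for your hostname. Landing on a specific survey page needs an HTTP-level 301 redirect served by whatever the subdomain actually points at. A sketch in .htaccess terms, assuming reviews.basequipment.com can be pointed at a web server you control:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^reviews\.basequipment\.com$ [NC]
        RewriteRule ^ http://www.resellerratings.com/store/survey/Burkett_Restaurant_Equipment_Supplies [R=301,L]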

    Read the article

  • Wordpress 3 multi-site install

    - by mike
    Hello, trying to figure out if this is possible... My company has a CMS product that was written in Java, and we decided to use WordPress to run blogs for our clients. Obviously, WordPress does not run on Tomcat (at least not by default), so we installed Pound (http://www.apsis.ch/pound/) on our server and have set up Apache and Tomcat on different ports. When "/blog/" is requested, the request is directed to Apache. This works fine, but we would like to use WordPress multisite so that we can manage all the blogs from a single interface. We would also like the URL for every site to be "/blog/", for example:

        http://www.site1.com/blog/
        http://www.site2.com/blog/

    I'm thinking it would have to be done with Apache??? Is it even possible? Thanks!
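
    For what it's worth, serving several customer domains from one WordPress 3 multisite was usually done by combining a path-based network with a domain-mapping plugin (WordPress MU Domain Mapping was the common choice at the time); the Pound/Apache split can stay as it is, as long as /blog/ keeps being routed to Apache for every domain. A sketch of the wp-config.php constants a path-based network uses once it has been created - the domain and path values here are illustrative, not taken from the question:

        define('WP_ALLOW_MULTISITE', true);
        define('MULTISITE', true);
        define('SUBDOMAIN_INSTALL', false);          // path-based network
        define('DOMAIN_CURRENT_SITE', 'www.site1.com');
        define('PATH_CURRENT_SITE', '/blog/');
        define('SITE_ID_CURRENT_SITE', 1);
        define('BLOG_ID_CURRENT_SITE', 1);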

    Read the article

  • How do I make virtual host DirectoryIndex file appear in the url?

    - by Bob Flemming
    I have set up a virtual host which specifies a default file to load when the URL is called. The problem I have is that I need that default DirectoryIndex file to appear in the URL. So when I go to www.mysite.co.uk, I want www.mysite.co.uk/app.php to appear in the URL. How can I achieve this using my virtual host configuration within my apache.conf file? Here is my current code:

        <VirtualHost *:80>
            ServerName *.mysite.co.uk
            DocumentRoot "/var/www/html/mysite/web/"
            DirectoryIndex app.php
        </VirtualHost>
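
    A hedged sketch of one way to do this: instead of serving app.php silently via DirectoryIndex, issue an external redirect from the bare site root so the filename shows up in the address bar. As an aside, wildcards belong in ServerAlias rather than ServerName, so the sketch splits them:

        <VirtualHost *:80>
            ServerName www.mysite.co.uk
            ServerAlias *.mysite.co.uk
            DocumentRoot "/var/www/html/mysite/web/"
            DirectoryIndex app.php

            # Requests for "/" are redirected to /app.php, which is then served directly
            RedirectMatch 301 ^/$ /app.php
        </VirtualHost>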

    Read the article

  • New: Online NetBeans 8 Crash Course

    - by Geertjan
    On Twitter today I came across an announcement for a brand new online course on NetBeans 8. Since NetBeans 8 was released during the past few months, the course is really very new. Go here to get there directly: https://www.video2brain.com/de/videotraining/netbeans-ide-8-0-crashkurs

    Here's the general idea. As you can see, the course is in German. With my basic understanding of German, I've had no problem following the course. The trainer speaks clearly and slowly, and everything is very well structured. The course covers all the basics of NetBeans IDE, from getting set up to using all the key features. The quality of the videos is great and the content is clear and informative. Once you've bought the course, all the lessons are unlocked. As you can see, they're all quite short and there's really a lot of content - it didn't all fit into the screenshot. Quite some work must have gone into this.

    Here's one of the free lessons in the course, to give an idea of what you'll get: https://www.video2brain.com/de/tutorial/texte-internationalisieren
    This one is also free: https://www.video2brain.com/de/tutorial/eclipse-projekt-importieren

    I highly recommend this course, especially if you're switching, or thinking about switching, from a different IDE and want to get a thorough overview of all the features that NetBeans IDE provides. Everything in the course is done within NetBeans, which means no slides, just code. You get to see the workflow of all the standard tasks and, for these purposes, the course does a really great job.

    Read the article

  • htaccess: how to rewrite to clean urls and redirect old urls to the new clean ones?

    - by Sebastian
    With .htaccess I'm trying to make my site's URLs clean. I use very basic URLs like www.mysite.com/pagename.php ("pagename" is variable), and I want www.mysite.com/pagename to display the content of /pagename.php. So this is in my .htaccess file now:

        Options +FollowSymlinks
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^([^\.]+)$ $1.php [NC,L]

    But I also want my old URLs (/pagename.php), when called, to be redirected to www.mysite.com/pagename. How to do this? I can't figure it out (I get loops all the time)... Thanks in advance!
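
    A commonly used sketch of the two halves together, hedged as untested against this particular site. The external redirect matches THE_REQUEST (the raw request line the browser sent) rather than the rewritten URL, which is what prevents the redirect and the internal rewrite from chasing each other in a loop:

        Options +FollowSymlinks
        RewriteEngine On

        # 1) Externally redirect direct requests for /pagename.php to /pagename
        RewriteCond %{THE_REQUEST} ^[A-Z]+\s/([^.\s]+)\.php[\s?] [NC]
        RewriteRule ^ /%1 [R=301,L]

        # 2) Internally rewrite the clean URL back onto the .php file
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^([^\.]+)$ $1.php [NC,L]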

    Read the article

  • .htaccess causes 403 error

    - by erdomester
    I have a working website on a free shared server. I decided to hire a dedicated server and purchase a domain for my website. I started uploading the files, but things aren't working the way they should. First of all, .htaccess is not working, even though I set AllowOverride from None to All in /etc/apache2/sites-available/default:

        DocumentRoot /var/www
        <Directory />
            Options FollowSymLinks
            AllowOverride None
        </Directory>
        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            allow from all
        </Directory>

    I restarted the server, of course. I enabled mod_rewrite (a2enmod rewrite) and restarted the server. This change causes a 403 Forbidden error which I am unable to work out. If I change the All back to None, then .htaccess is ignored, so instead of loading the website the file hierarchy is loaded (the main page is index4.php, which should be opened by .htaccess). If I rename index4.php to index.php, the website loads, just FYI. The permissions on the file are 600. If I change them to 444, I get a 500 Internal Server Error. I checked the logs and I see many errors like this:

        Permission denied: file permissions deny server access: /var/www/index.html
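
    The log line points at plain filesystem permissions: the Apache user (www-data on Debian/Ubuntu) cannot read the uploaded files. A minimal sketch of the usual normalisation - directories traversable (755), files world-readable (644) - assuming everything lives under /var/www:

        sudo find /var/www -type d -exec chmod 755 {} \;
        sudo find /var/www -type f -exec chmod 644 {} \;

    With readable files in place, the .htaccess rule that points at index4.php at least gets a chance to run instead of tripping over 403/500 errors.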

    Read the article

  • mod_ReWrite to remove part of a URL

    - by Jack
    Someone has incorrectly linked to some of my URLs, causing 404 errors in Google Webmaster Tools. Here is an example:

        Linked URL:  http://www.example.com/foo-%E2%80%8Bbar.html
        Correct URL: http://www.example.com/foo-bar.html

    I would like to 301 redirect any instance of this kind of incorrect linking to the correct URL. I have tried the following, but it generates 404 errors site-wide:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteRule ^foo-(.*)bar\.html$ http://www.example.com/foo-bar\.html? [L,R=301]

    Could anyone let me know what I am doing wrong?
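
    A sketch rather than a verified diagnosis: %E2%80%8B is a URL-encoded zero-width space, and mod_rewrite matches against the already-decoded path, so the stray character can be stripped generically instead of writing one rule per damaged URL. The pattern below uses PCRE \xHH escapes for the three UTF-8 bytes, removes one such character and redirects; repeated occurrences would be cleaned up over successive redirects:

        Options +FollowSymLinks
        RewriteEngine On
        RewriteCond %{REQUEST_URI} ^(.*)\xe2\x80\x8b(.*)$
        RewriteRule ^ %1%2 [R=301,L,NE]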

    Read the article

  • Allowing users in from an IP address without certificate client authentication

    - by John
    I need to allow access to my site without SSL client certificates from my office network, and with SSL client certificates from outside. Here is my configuration:

        <Directory /srv/www>
            AllowOverride All
            Order deny,allow
            Deny from all
            # office network static IP
            Allow from xxx.xxx.xxx.xxx
            SSLVerifyClient require
            SSLOptions +FakeBasicAuth
            AuthName "My secure area"
            AuthType Basic
            AuthUserFile /etc/httpd/ssl/index
            Require valid-user
            Satisfy Any
        </Directory>

    The results:

    - Inside the network with a certificate: I can access.
    - Inside the network without a certificate: I can't access; it requires a certificate.
    - Outside the network with a certificate: I can't access; it shows me the basic login screen.
    - Outside the network without a certificate: I can't access; it shows me the basic login screen.

    And the following configuration works perfectly:

        <Directory /srv/www>
            AllowOverride All
            Order deny,allow
            Deny from all
            Allow from xxx.xxx.xxx.xxx
            AuthUserFile /srv/www/htpasswd
            AuthName "Restricted Access"
            AuthType Basic
            Require valid-user
            Satisfy Any
        </Directory>
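
    One hedged observation: SSLVerifyClient require is enforced during the TLS handshake itself, before Satisfy Any is ever consulted, which is consistent with the "inside without a certificate" case failing. If the server is (or can be moved to) Apache 2.4, the "office IP OR verified client certificate" logic can be expressed directly; a sketch with placeholder values:

        # Ask for a client certificate, but don't fail the handshake without one
        SSLVerifyClient optional
        SSLVerifyDepth  2

        <Directory /srv/www>
            AllowOverride All
            <RequireAny>
                # Office network (placeholder range)
                Require ip 203.0.113.0/24
                # ...or a successfully verified client certificate
                Require expr "%{SSL_CLIENT_VERIFY} == 'SUCCESS'"
            </RequireAny>
        </Directory>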

    Read the article

  • /var/plone ownership

    - by jake
    Could anyone help me with my problem? My /var/plone folder's ownership was accidentally changed. Can anyone tell me who the owner of this folder should be? Could it be www:root, www-data:root or root:root? By the way, I changed it to www:root, and now some images and styles cannot be loaded on my site. Every time somebody opens the site it keeps loading and loading, and after a few minutes it stops; some images are not loaded and the styles are not displayed the way they are supposed to be. Thanks.
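
    The right owner depends on which user the Zope/Plone processes were installed to run as (often something like plone or zope, depending on the installer), rather than on the web server user. A quick way to check, sketched with an assumed user name:

        # See which user the Zope/Plone/ZEO processes actually run as
        ps -eo user,comm | grep -iE 'zope|zeo|plone' | sort -u

        # If they run as, say, "plone" (adjust to what the command above reports),
        # hand the tree back to that user:
        sudo chown -R plone: /var/plone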

    Read the article
