Search Results

Search found 29495 results on 1180 pages for 'cross site scripting'.

Page 37/1180 | < Previous Page | 33 34 35 36 37 38 39 40 41 42 43 44  | Next Page >

  • Install NPM Packages Automatically for Node.js on Windows Azure Web Site

    - by Shaun
    In one of my previous posts I described how to use NPM packages with Node.js on Windows Azure Web Site (WAWS). In that post I used the NPM command to install packages, then used Git for Windows to commit my changes and sync them to the WAWS git repository, which triggered a new deployment of my Node.js application. You may have noticed that an NPM package can contain many files and can be fairly large. For example, the "azure" package, the Windows Azure SDK for Node.js, is about 6 MB; the popular "express" package, a rich MVC framework for Node.js, is about 1 MB. When I first push my code to Windows Azure, all of these files must be uploaded to the cloud. Is it possible to let Windows Azure download and install these packages for us? In this post I will show how to make WAWS install all required packages during deployment.

    Let's Start with a Demo

    A demo is the most straightforward way to explain. Create a new WAWS and clone it to local disk, then drag the folder into Git for Windows so it can help us commit and push. Please refer to this post if you are not familiar with Windows Azure Web Site, Git deployment, git clone, or Git for Windows. Then open a command window in the code folder and install a package, say "express". Next, create a new Node.js file named "server.js" and paste in the code below.

        var express = require("express");
        var app = express();

        app.get("/", function(req, res) {
            res.send("Hello Node.js and Express.");
        });

        console.log("Web application opened.");
        app.listen(process.env.PORT);

    If we switch to Git for Windows now, we will find that it detected our changes, including "server.js" and all files under the "node_modules" folder. Only our source code needs to be uploaded, but the large package files would be uploaded as well. Here is how to exclude them and let Windows Azure install the packages on the cloud side.

    First, add a special file named ".gitignore". This seems impossible to do directly from the file explorer, since the file name consists only of an extension, so create it from the command line instead. Navigate to the local repository folder and run the command below to create an empty ".gitignore" (if the command window asks for input, just press Enter).

        echo > .gitignore

    Open the file, add the line below, and save.

        node_modules

    Switching back to Git for Windows, the packages under "node_modules" are no longer in the change list, so committing and pushing will not upload the "express" package to Windows Azure.

    Second, tell Windows Azure which packages it needs to install when deploying. Create another file named "package.json" with the content below.

        {
            "name": "npmdemo",
            "version": "1.0.0",
            "dependencies": {
                "express": "*"
            }
        }

    Back in Git for Windows, commit the changes and push them to WAWS. If we open the WAWS in the developer portal, we will see a new deployment has finished. Clicking the arrow on the right side of the deployment shows how WAWS handled it; in particular, we can see that WAWS executed NPM, and the log shows the command WAWS ran to install the packages along with the installation output. As you can see, WAWS installed "express" on the cloud side, so I did not need to upload the whole package to Azure.
    Open the website and we can see the result, which proves "express" was installed successfully.

    What Happened Under the Hood

    Now let's explain what ".gitignore" and "package.json" mean. ".gitignore" is the ignore configuration file for a git repository: all files and folders listed in it are skipped by git push. In the example above I put "node_modules" in this file, which tells git not to track or upload anything under the "node_modules" folder, so all packages are skipped when uploading to Windows Azure. ".gitignore" can list files and folders, and it can also list entries we do NOT want ignored; in the next section we will use this un-ignore syntax to include the SQL package.

    "package.json" is the package definition file for a Node.js application. It defines the application name, version, description, author and other information in JSON format, and it can also declare which packages the application depends on. In WAWS, the name and version are required. When a deployment happens, WAWS reads this file, finds the dependent packages, and runs the NPM command to install them one by one. In the demo above I put "express" in this file so that WAWS would install it for me automatically.

    I updated the dependencies section of "package.json" manually, but this can be partly automated. If we have a valid "package.json" in the local repository, we can pass the "--save" parameter to "npm install" and NPM will update the dependencies section for us. For example, to install the "azure" package I would run the command below; note the "--save" at the end.

        npm install azure --save

    Once it finishes, "package.json" is updated automatically: each dependent package appears with the package name as the JSON key and the version range as the value. Below is a brief list of the version range formats; for more information about "package.json" please refer here.

        Format       Description                                                          Example
        version      Must match the version exactly.                                      "azure": "0.6.7"
        >=version    Must be greater than or equal to the version.                        "azure": ">=0.6.0"
        1.2.x        Must start with the supplied digits; any digit may replace the x.    "azure": "0.6.x"
        ~version     At least as high as the version, and less than the next major
                     revision above it.                                                   "azure": "~0.6.7"
        *            Matches any version.                                                 "azure": "*"

    WAWS will install the appropriate version of each package based on what is defined here.

    But Some Packages…

    As we saw, when we specify dependencies in "package.json", WAWS downloads and installs them on the cloud, and for most packages this works very well. But some special packages may not: if the installation has particular environment requirements, it can fail. For example, the SQL Server Driver for Node.js needs "node-gyp", Python and C++ 2010 installed on the target machine during NPM installation. If we simply put "msnodesql" in the "package.json" file and push it to WAWS, the deployment will fail because none of these are present in the WAWS virtual machine. For example, the "server.js" file:
1: var express = require("express"); 2: var app = express(); 3: 4: app.get("/", function(req, res) { 5: res.send("Hello Node.js and Express."); 6: }); 7:  8: var sql = require("msnodesql"); 9: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:tqy4c0isfr.database.windows.net,1433;Database=msteched2012;Uid=shaunxu@tqy4c0isfr;Pwd=P@ssw0rd123;Encrypt=yes;Connection Timeout=30;"; 10: app.get("/sql", function (req, res) { 11: sql.open(connectionString, function (err, conn) { 12: if (err) { 13: console.log(err); 14: res.send(500, "Cannot open connection."); 15: } 16: else { 17: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 18: if (err) { 19: console.log(err); 20: res.send(500, "Cannot retrieve records."); 21: } 22: else { 23: res.json(results); 24: } 25: }); 26: } 27: }); 28: }); 29: 30: console.log("Web application opened."); 31: app.listen(process.env.PORT); The “package.json” file. 1: { 2: "name": "npmdemo", 3: "version": "1.0.0", 4: "dependencies": { 5: "express": "*", 6: "msnodesql": "*" 7: } 8: } And it failed to deploy to WAWS. From the NPM log we can see it’s because “msnodesql” cannot be installed on WAWS. The solution is, in “.gitignore” file we should ignore all packages except the “msnodesql”, and upload the package by ourselves. This can be done by use the content as below. We firstly un-ignored the “node_modules” folder. And then we ignored all sub folders but need git to check each sub folders. And then we un-ignore one of the sub folders named “msnodesql” which is the SQL Server Node.js Driver. 1: !node_modules/ 2:  3: node_modules/* 4: !node_modules/msnodesql For more information about the syntax of “.gitignore” please refer to this thread. Now if we go to Git for Windows we will find the “msnodesql” was included in the uncommitted set while “express” was not. I also need remove the dependency of “msnodesql” from “package.json”. Commit and push to WAWS. Now we can see the deployment successfully done. And then we can use the Windows Azure SQL Database from our Node.js application through the “msnodesql” package we uploaded.   Summary In this post I demonstrated how to leverage the deployment process of Windows Azure Web Site to install NPM packages during the publish action. With the “.gitignore” and “package.json” file we can ignore the dependent packages from our Node.js and let Windows Azure Web Site download and install them while deployed. For some special packages that cannot be installed by Windows Azure Web Site, such as “msnodesql”, we can put them into the publish payload as well. With the combination of Windows Azure Web Site, Node.js and NPM it makes even more easy and quick for us to develop and deploy our Node.js application to the cloud.   Hope this helps, Shaun All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Google bots are severely affecting site performance

    - by Lynn
    I have an aggregate site on a Linux server that pulls in feeds from a universe of about 2,000 blogs. It runs WordPress 3.4.2, and I have a cron job, staggered to run five times an hour on another server, that pulls in the stories and publishes them to the front page of this site; this is so I don't put all the pressure on one server. However, the Google bots, which visit a few times every hour, bring the server to its knees in the mornings and evenings when there is an increase in traffic on the site. The bots have something like 30,000 links to follow at this point. How do I throttle the bots to simply grab the new stories off the front page and stop there?

    EDIT - Details of my server configuration: the server that handles all the publishing is an unmanaged instance on AWS. It mounts the NFS server and connects to the RDS to update content, etc. You get to this publishing instance via a plugin that detects the wp-admin link and then redirects you there. The front-end app server also mounts the NFS and requests data from the RDS; it is the only one that has WP Super Cache on it. The OS is Ubuntu on the app server and the NFS runs CentOS. The front end is Nginx and the publishing server is Apache.
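
    One mechanism worth noting: Googlebot ignores the Crawl-delay directive (its crawl rate is set in Google Webmaster Tools instead), but robots.txt Disallow rules can keep it off the deep archive links while leaving the front page crawlable. A minimal sketch, with entirely hypothetical archive paths - adjust them to the site's real URL structure:

        # robots.txt -- paths are illustrative, not from the post
        User-agent: *
        Disallow: /page/     # paginated archive pages
        Disallow: /2012/     # date-based archives

    The front page itself remains crawlable because it matches no Disallow rule.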

    Read the article

  • Ranking hit after WP site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed WMT completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and for the keywords I had targeted and performed well with before, I am now ranking much lower. For some low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. This has started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that it would not affect my rankings too severely, but it has now been a month since the move and recovery seems to be very slow, if it is happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site will now be pointing to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that fit exactly the same page structure as the previous site I have seen a drop in rankings. Any help would be greatly appreciated. Thanks!

    Read the article

  • Why old (301) links stay on Google when breaking a site down into multiple domains

    - by Sampo Sarrala
    Some background: we used to have a single site on a single domain (let's call it mainsite.com) with product information, but things have changed since then and the product database has grown fast. So we decided to move some major products/manufacturers under their own domains (let's call one of them subsite.com) while still using our main database/codebase. What we've done: added the subsite.com domain for product 1 by Great Products Co., with some nice new front pages, info pages, etc., and detail pages that use information from the original database. We redirected the product/group links from mainsite.com using 301 redirects and verified that the redirects work as expected. Then we waited some time for Google to reindex (over 30 days, which I've heard should be more than enough). Results: if I search for our moved products on Google, it finds and lists them, but with old links to our main site like mainsite.com/group/product1, when it should show the link to the new site, subsite.com/product1. The links from Google redirect as they should, since the redirects are verified [301]. Main question: is there any reason why Google would not follow the 301 redirects and update its links so that they point to our new manufacturer/product site subsite.com?
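
    For reference, this kind of cross-domain 301 can be expressed with mod_rewrite; a minimal sketch using the post's own example URLs (the exact rule pattern is illustrative, and assumes an .htaccess context where patterns have no leading slash):

        # .htaccess on mainsite.com
        RewriteEngine On
        RewriteRule ^group/product1$ http://subsite.com/product1 [R=301,L]

    The status code and Location header can be double-checked from the shell with `curl -I http://mainsite.com/group/product1`; a 302 here instead of a 301 would be one reason for Google to keep the old URLs.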

    Read the article

  • Scripting an 'empty' password in /etc/shadow

    - by paddy
    I've written a script to add CVS and SVN users on a Linux server (Slackware 14.0). This script creates the user if necessary, and either copies the user's SSH key from an existing shell account or generates a new SSH key. Just to be clear, the accounts are specifically for SVN or CVS. So the entry in /home/${username}/.ssh/authorized_keys begins with (using CVS as an example): command="/usr/bin/cvs server",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-rsa ....etc...etc...etc... Actual shell access will never be allowed for these users - they are purely there to provide access to our source repositories via SSH. My problem is that when I add a new user, they get an empty password in /etc/shadow by default. It looks like: paddycvs:!:15679:0:99999:7::: If I leave the shadow file as is (with the !), SSH authentication fails. To enable SSH, I must first run passwd for the new user and enter something. I have two issues with doing that. First, it requires user input which I can't allow in this script. Second, it potentially allows the user to login at the physical terminal (if they have physical access, which they might, and know the secret password -- okay, so that's unlikely). The way I normally prevent users from logging in is to set their shell to /bin/false, but if I do that then SSH doesn't work either! Does anyone have a suggestion for scripting this? Should I simply use sed or something and replace the relevant line in the shadow file with a preset encrypted secret password string? Or is there a better way? Cheers =)
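
    One hedged approach, based on shadow(5) semantics: a "!" in the password field marks the account as locked, which OpenSSH on Linux shadow systems refuses even for key-based logins, while "*" simply means no password can ever match, leaving key authentication usable. Note also that sshd runs the forced command from authorized_keys via the user's shell, which is why /bin/false breaks it. A sketch (the username is the post's example):

        # create the account; keep a real shell so the forced command
        # in authorized_keys can run (sshd invokes it via the shell)
        useradd -m -s /bin/bash paddycvs
        # set the password field to '*': no password will ever match,
        # but the account is not locked, so key-based SSH still works
        usermod -p '*' paddycvs

    This avoids editing /etc/shadow with sed and requires no interactive input, and the forced command still prevents the account from being used for anything but CVS/SVN.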

    Read the article

  • Advice on software infrastructure for a FLOSS bounty site

    - by michaeljt
    I am planning to set up a simple web site where people can offer bounties for work on FLOSS projects. Unfortunately I have no experience at web development (I am a C/C++ developer), so I was hoping someone might be able to suggest out-of-the-box packages (preferably Debian ones) I could use to build the site from. My idea of how the site would work is to keep things as simple as possible. The person proposing a bounty would enter a description with relevant links (particularly to a bugtracker entry with the project the work is to be done on, where the real discussion and work would take place) and information and place an initial contribution. Other people would be able to add (donate, not pledge) contributions, but any discussion would take place on the project's bugtracker. I am also planning to run a mailing list rather than a forum (at least initially), so that is not a requirement. Paypal seems to me to be the handiest payment mechanism. So overall what I need is probably a simple interface with Paypal integration and a simple database backend. I hope this is the right place for my question, if not I would be grateful for pointers to somewhere better. And of course, this is purely about the technical side, though I am more than happy to discuss other aspects of the project elsewhere.

    Read the article

  • Moving from a static site to a CMS with new URLs and meta-data for pages

    - by Chris J
    Hi, I am in the process of rebuilding a site from static pages to a CMS which will use mod_rewrite to generate new page URLs. As part of this, our marketing people and I have decided to tidy up the descriptions, keywords and titles. E.g. a page whose URL is currently "website-name/about_us.html" with a title of "website-name - something not quite page specific" will change to "website-name/about-us/" with the title "about us - website-name", and may have a few keywords and the description changed. Our goal with updating the meta data is to improve our page rankings and keep in line with best practices for SEO. Though our current page rankings are quite good in many respects, there is room for improvement. All of the pages will also have content changes (rearranged heading tags, a new menu on all pages, new content in the footer, extra pieces of dynamic content relating to other pages). In this new site process I plan to use 301 redirects for all the old URLs, pointing to the new URLs. My question is: what can I expect to happen to the page rankings in Google, in the short term and the long term? Will this be like kicking off a new site which will have to build up trust over time, or will the original page rankings have an effect?
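
    For the redirects themselves, the post's own example maps cleanly onto mod_alias; a minimal .htaccess sketch, using the post's placeholder domain (older Apache versions require an absolute URL as the target, hence the full hostname):

        # .htaccess on the new site -- one rule per retired static page
        Redirect permanent /about_us.html http://website-name/about-us/

    With many pages, a RewriteMap or a generated list of Redirect lines keeps this maintainable.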

    Read the article

  • New site not appearing in index after change of address, no feedback from google webmaster tools

    - by Duffy
    Our change of address seems not to be taking effect. Here's the story so far: we're a web company and our product is called The New Hive. Our site used to be at thenewhive.com, but we decided to switch to newhive.com (drop the "the", it's cleaner). The timeline of what I've tried, starting on July 29th:

    used 301 redirects for all pages (e.g. thenewhive.com/tag/art = newhive.com/tag/art)

    At this point we noticed that we had disappeared from search results when searching "The New Hive"; the front page used to be all links to our site plus a couple of news articles about the company. So on August 5th I:

    verified the new domain in Webmaster Tools (the old domain was already verified)
    submitted a change of address request with Webmaster Tools / Configuration / Change of Address

    Then after another week, on August 13th, I:

    went to Webmaster Tools / Health / Fetch as Google
    fetched our homepage and a couple of sub-pages, all successfully
    clicked "Submit to Index" for the homepage

    As of today (August 23rd) we're still not showing up in the index. We're getting no warnings or feedback of any kind from the dashboard, so I'm inclined to think something's broken with the dashboard rather than that something's wrong with our site from an SEO perspective. From the dashboard: no new messages or recent critical issues. Crawl Errors: no data available. From Health - Index Status: Total indexed 0, Ever crawled 42,490, Not selected 12, Blocked by robots 0. I'm really at a loss here, any help would be appreciated.
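
    Independently of the dashboard, the 301s can be sanity-checked from the command line; a quick sketch using the post's own example URL:

        curl -I http://thenewhive.com/tag/art
        # expected response, if the redirect is in place:
        #   HTTP/1.1 301 Moved Permanently
        #   Location: http://newhive.com/tag/art

    A 302 here instead of a 301 would explain Google treating the move as temporary rather than permanent.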

    Read the article

  • How can a website look different in Safari on Windows and Safari on Mac?

    - by Jakob
    I have the website http://storkbox.magentodemo.dk. I've been testing cross-browser on my Windows PC, and it looks good in all browsers, but in Safari on Mac it looks like the CSS is not being interpreted correctly, or there is a critical JavaScript error. When I look in the console cross-browser, the error log shows exactly the same entries. Chrome on Mac renders the site as intended, so why do I have a problem with Safari? It is the same across different computers, and iPhone Safari also shows the site wrong. How is this possible and how do I debug it?

    Read the article

  • SEO: Getting site to show in location-specific searches

    - by willvv
    I'm really new to this SEO world and I've been reading a lot to try and figure it out. We have a site, moodbond.com, that allows users to browse/create events anywhere, and we fill it with content from the main cities in the US. We would like it to show up for searches like "events in san francisco" or "what to do in new york"; however, since the site is not really location-specific, I'm not sure where to begin. I've been thinking of a few things; maybe you can help me decide if these would be a good way to start or if I should try something different.

    1. Allow location-specific URLs (e.g. moodbond.com/browse/san-francisco) that just show the main page centered on San Francisco.
    2. Change the headers/title of the page so they adapt automatically to the city being browsed (and change this dynamically as the user changes the location of the map).
    3. Add internal links to different locations, e.g. a link in the footer of the page that says "Events in Seattle" and makes the site load events in that city (this would probably depend on implementing #1).

    What do you guys think? Will any of these really help, or should I look for a different approach? Any advice is welcome. Thanks

    Read the article

  • Drupal site Instant Messaging [migrated]

    - by pthurmond
    I am trying to find a module, or a standalone solution that I can turn into a module, that will give a Drupal site I am working on an instant messaging system like Facebook's. I have never set up a chat system before, and my requirements are rather stringent. It needs to be a solution where we host the chat server (if one is needed separate from the website itself). It must use the site's login state (it can't use an external system at all, which means no GTalk, Yahoo IM, or AIM). It also must be able to handle up to 1,000 users at any given time. I have looked through the Drupal community and tried the DXMPP module, but it requires jQuery UI 1.8 and that doesn't work with all of the other things my site uses (such as Homebox). We do have a Jabber server already set up and ready to go. Does anyone have any thoughts or options here? Thanks! EDIT: We are using Drupal 6.

    Read the article

  • How do you handle very old browsers on your site?

    - by Alex
    Hi. We have a non-profit web site that got about 5 million hits in May. Of those, about 5,700 were from IE 5.x or lower; about 4,000 were from folks with Netscape 4.x or lower. We know that the current site's layout works for newer browsers and we're testing it on IE6 as well (along with Chrome, Opera, Safari, and Firefox). How do you handle folks with older browsers? Because of jQuery libraries and such, the pages might not function correctly on those old browsers. Is there an easy way to show a text-only version on browsers that can't handle the CSS and jQuery goodies? How do large sites handle this sort of thing? I've used the @import trick to hide the stylesheet from Netscape 4.x, but I'm not sure what to do beyond that.
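
    For the IE 5.x share specifically, conditional comments (supported since IE 5) offer one low-effort escape hatch; a sketch, where the text-only URL is hypothetical:

        <!--[if lt IE 6]>
        <meta http-equiv="refresh" content="0;url=/text-only/">
        <![endif]-->

    Netscape 4.x ignores conditional comments, so it would still need the @import trick or server-side user-agent sniffing to get the same fallback.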

    Read the article

  • How to develop a site for multiple browsers these days?

    - by Misha Moroshko
    I would like to develop a site that works in all major browsers, and I wonder what tools are available these days to help me check functionality across browsers. I mean that after I add some functionality to my site, I want to check it in all browsers. Are there any tools or pieces of software for this task? I understand that it's impossible to check everything, because whether something works as expected can be subjective, but maybe there are tools that can find major errors (like IE not supporting indexOf).
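
    As a concrete illustration of the indexOf gap: Array.prototype.indexOf is missing in IE 8 and earlier, and a feature-detecting shim sidesteps it without any browser sniffing. A minimal sketch:

        // only define indexOf where the browser lacks it (IE <= 8)
        if (!Array.prototype.indexOf) {
            Array.prototype.indexOf = function (item, from) {
                var i = from || 0;
                if (i < 0) { i = Math.max(0, this.length + i); }
                for (; i < this.length; i++) {
                    // the "in" check skips holes in sparse arrays
                    if (i in this && this[i] === item) { return i; }
                }
                return -1;
            };
        }

    Static checkers and cross-browser test services can flag calls like this, but a shim removes the class of error entirely.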

    Read the article

  • Draw a cross at the center of a UIImagePickerController

    - by VansFannel
    Hello. I'm very new to iPhone development. I'm trying to draw a cross over the image obtained from the camera, using a custom ViewController that inherits from UIImagePickerController. When I start the application I see the cross, but a few seconds later the cross disappears. Should I use cameraOverlayView? Thank you.

    Read the article

  • WCF cross-domain policy security error

    - by George2
    Hello everyone, I am using VSTS 2008 + C# + WCF + .Net 3.5 + Silverlight 3.0. I host a Silverlight control in an HTML page and debug it from VSTS 2008 (press F5, then run in the VSTS 2008 built-in ASP.Net development web server), then call a WCF service hosted on another machine running IIS 7.0 + Vista. The WCF service is very simple: it just returns a constant string to the client. When invoking the WCF service from Silverlight, I get the following error message:

    An error occurred while trying to make a request to URI 'https://LabTest/Test.svc'. This could be due to attempting to access a service in a cross-domain way without a proper cross-domain policy in place, or a policy that is unsuitable for SOAP services. You may need to contact the owner of the service to publish a cross-domain policy file and to ensure it allows SOAP-related HTTP headers to be sent. This error may also be caused by using internal types in the web service proxy without using the InternalsVisibleToAttribute attribute. Please see the inner exception for more details.

    Here is the clientaccesspolicy.xml file; is anything wrong with it?

        <?xml version="1.0" encoding="utf-8" ?>
        <access-policy>
          <cross-domain-access>
            <policy>
              <allow-from http-request-headers="*">
                <domain uri="*"></domain>
              </allow-from>
              <grant-to>
                <resource path="/" include-subpaths="true"></resource>
              </grant-to>
            </policy>
          </cross-domain-access>
        </access-policy>

    thanks in advance, George

    Read the article

  • How To Discover RSS Feeds for a given site.

    - by ktolis
    The quest: given a site URL (say http://stackoverflow.com/), return the list of all the feeds available on the site. Acceptable methods:

    a) use a 3rd-party service (Google? Yahoo? ...) programmatically
    b) use a crawler/spider (with some tips on how to configure the spider to return only the RSS/XML feeds)
    c) do it programmatically in C/C++/PHP (any language/library)

    The task here is not to get the feeds contained in the page returned by the URL, but ALL the feeds that are available on the server at any depth... In any case, please provide a simple usage example.
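
    For the programmatic route, the standard convention is RSS/Atom autodiscovery: pages advertise their feeds with <link rel="alternate" type="application/rss+xml" ...> tags in the <head>. A minimal Node.js sketch (no external packages; it only finds feeds advertised by the fetched page, does not follow redirects, and would be run per page when crawling deeper):

        // fetch a page and scan it for advertised RSS/Atom feed links
        var http = require("http");

        function discoverFeeds(host, path, callback) {
            http.get({ host: host, path: path }, function (res) {
                var body = "";
                res.setEncoding("utf8");
                res.on("data", function (chunk) { body += chunk; });
                res.on("end", function () {
                    var feeds = [];
                    // match <link> tags with an RSS/Atom MIME type
                    var re = /<link[^>]+type="application\/(?:rss|atom)\+xml"[^>]*>/gi;
                    var tags = body.match(re) || [];
                    tags.forEach(function (tag) {
                        var href = tag.match(/href="([^"]+)"/i);
                        if (href) { feeds.push(href[1]); }
                    });
                    callback(feeds);
                });
            });
        }

        // usage example
        discoverFeeds("stackoverflow.com", "/", function (feeds) {
            console.log(feeds);
        });

    Regex-scraping HTML is fragile compared to a real parser, but it is enough to show the autodiscovery convention the crawler would rely on.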

    Read the article

  • SharePoint Content and Site Editing Tips

    - by Bil Simser
    A few content management and site editing tips for power users on this bacon-flavoured unicorn morning. The theme here is: keep it clean!

    Write "friendly" email addresses
    Remember it's human beings reading your content, so seeing something like "If you have questions please send an email to [email protected]" breaks up the readability. Instead, write the content in plain English, then go back, highlight the name, and insert a link (note: you might have to prefix the link with mailto:[email protected]). It makes for a friendlier looking page and hides the ugliness that is sometimes found in email addresses.

    Use friendly column and list names
    This is a big pet peeve of mine. When you first create a column or list with spaces in the name, the internal name is changed. The display name might be "My Amazing List of Animals with Large Testicles" but the internal (and link) name becomes "My_x00x20_Amazing_x00x20_List_x00x20_of_x00x20_Animals_x00x20_with_x00x20_Large_x00x20_Testicles". What's worse is if you create a publishing page named "This Website is Fueled By a Dolphin's Spleen". Not only is it incorrect grammar, but the apostrophe wreaks havoc on both the internal name for the list (with lots of crazy hex codes) and the hyperlink (where everything is URL-encoded). Instead, create the list with a distinct and compact name, then go back and change it to whatever you want. The end result is a better-formed name that is easier both to script against and to access in code.

    Keep your Views Clean
    When you add a column to a list or create a new list, the default is to add it to the default view. Do everyone a favour and don't check this box! The default view of a list should be something like the Title field and nothing else. Keep it clean. If you want a different default view, go back and create one with all the fields, filtering and sorting columns you want, and set it as the default. It's a good idea to keep the original AllItems.aspx (note the lack of spaces in the filename!) simple and unfiltered. It's also a good idea to keep the column count down in views. Don't let every column be added by default, and don't add every column just because you can. Create separate views for distinct responsibilities, and try to keep the number of columns down to a single screen to prevent horizontal scrolling.

    Simple Navigation
    The Quick Launch is a great tool for navigating around your site, but don't use the default of adding all lists to it. Uncheck that box and keep navigation simple. Create custom groupings that make sense: if your site doesn't suit "Documents and Lists" and "Reports and Notices" makes more sense, then use that. Also hide internal lists from the Quick Launch; for example, if most users don't need to see all the lookup tables you might have on a site, don't show them. You can use audience filtering on the Quick Launch to hide admin items from non-admin users, so consider that as an option.

    Enjoy!

    Read the article

  • What would you take into account when you were asked to compare software? [closed]

    - by mstaessen
    For my master's thesis, I am asked to make a comparative study of frameworks for cross-platform mobile development. I want to eliminate the chances of having missed something in my comparison. This is why I want to ask what YOU would value (most) when comparing such frameworks (Like for instance PhoneGap and Appcelerator Titanium). Performance, capabilities and licensing are kind of obvious, but can you think of others?

    Read the article

  • Straight cable or crossover cable?

    - by om sai
    I have a PPPoE internet connection. My ISP provides connections like this: ISP -> media converter (fiber) -> 8-port switch (TP-Link TL-SF1008D) -> individual internet connection account holders (like me). The connection between the media converter and the switch is made with a 4-pair crossover Cat5 cable, and I would like to connect the cable running from the switch to a router, and through the router to my PC. So what type of cable should I use between the switch and the router: straight-through or crossover? The point I am trying to make is that I am able to connect to the internet using a straight-through cable between the switch and my PC, but when I connect the same cable between the switch and the router, and from the router to my PC, I am not able to connect to the internet. Also, if I use a 2-pair cable (instead of 4-pair) between the switch and the router, I am able to connect to the internet, but the same is not true with a 4-pair Cat5 cable.

    Read the article

  • Apache virtual host for drupal test site

    - by bsreekanth
    Hello, I am a programmer trying to launch my first website. Through various helpful posts on SF and elsewhere, I set up an account with Linode and configured a slice (Debian, Apache, etc.). I have a Drupal site under development and would like to have a test site on the Linode server as well. So, I would like a setup that meets the following requirements. What is the best way to set up and protect the test site alongside the actual (production) site? Is a virtual host the answer? To protect the test site, is .htaccess authentication sufficient to keep out the public and robots? I am also modifying the theme, database contents, etc., so having two sites under one Drupal installation may not be a good idea. What do you suggest? Thanks in advance, bsreekanth.
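
    To make the virtual-host option concrete: name-based vhosts with HTTP basic auth on the test copy are a common pattern. A minimal sketch with hypothetical hostnames and paths (the htpasswd file would be created with `htpasswd -c`, and Apache 2.2 also needs "NameVirtualHost *:80", which Debian ships enabled in ports.conf):

        <VirtualHost *:80>
            ServerName example.com
            DocumentRoot /var/www/production
        </VirtualHost>

        <VirtualHost *:80>
            ServerName test.example.com
            DocumentRoot /var/www/test
            <Directory /var/www/test>
                # basic auth keeps out both the public and crawlers,
                # since robots cannot authenticate
                AuthType Basic
                AuthName "Test site"
                AuthUserFile /etc/apache2/test.htpasswd
                Require valid-user
            </Directory>
        </VirtualHost>

    A Disallow-everything robots.txt on the test vhost adds a polite second layer, but the authentication is what actually blocks access.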

    Read the article

  • "Server not found" for live version of site

    - by user1491819
    I can access my local dev site on my local PC, e.g. http://mysite, but I cannot access the live site, even though it works fine on other PCs: http://www.mysite.com. The live site gives the error in Firefox: "Server Not Found". Pinging www.mysite.com gives the error "Ping request could not find host www.mysite.com". My hosts file contained:

        127.0.0.1 mysite

    I changed the hosts file to the following and rebooted:

        127.0.0.1 mysitedev

    I'm running XP and have cleared the DNS cache using "ipconfig /flushdns". I have verified the live site is up using http://www.isup.me/, and the site loads fine on my phone. What could be preventing my local PC from accessing the live site?
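
    Since "Ping request could not find host" means name resolution is failing locally (not that the server is down), one quick way to isolate the problem is to query a public resolver directly; a sketch:

        nslookup www.mysite.com 8.8.8.8

    If that resolves while a plain `nslookup www.mysite.com` does not, the culprit is the local DNS configuration or a stale hosts entry rather than the site itself.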

    Read the article

  • Configure IIS to rewrite IP Address to Site Name

    - by Bath Man
    So I've started my first web site from home, and I'm trying to get it up and running and crawlable by Google, but I can't seem to figure out how to have my site name shown in the address bar instead of my IP address. I've purchased a domain name for my site on GoDaddy and set it to redirect to my site. When you type in the domain name, you get redirected to http://0.0.0.0/default.aspx (not my real IP, obviously), and that stays in the user's address bar. To fix that temporarily, I've set up masking on GoDaddy, which keeps the URL in the address bar but just shows my website in a frame. This is fine for users visiting the site, but automated robots such as Googlebot cannot discover my content because of the frame. I've looked into ISAPI filters and server-side rewriting and the like, but I just can't quite figure out how to do what I need. Any simple suggestions or links would be appreciated.
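
    For what it's worth, the frame comes from GoDaddy's "forwarding with masking"; the frameless alternative is pointing the domain's DNS directly at the server's IP and giving the IIS site a host-header binding for the domain, so no redirect or frame is involved and the name stays in the address bar. A sketch of the DNS side (names and IP entirely illustrative):

        ; zone records pointing the domain at the web server
        example.com.      IN  A  203.0.113.10
        www.example.com.  IN  A  203.0.113.10

    This does assume the home connection has a static IP (or a dynamic-DNS arrangement), which is worth checking first.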

    Read the article

  • Cross-match a number of worksheets to one master worksheet

    - by Carter
    Hopefully the title is not too confusing. Basically, I have a master list of addresses listed in multiple columns (column A - street number, column B - street name, column C - street type, etc.), and I get another set of addresses on a daily basis with the same address formatting. What I need to do is cross-match the daily changing list of addresses against the first list and remove any matching entries. So, for example, if the first list has 123 Main St on it, I have to ensure that there are no entries of 123 Main St on any subsequent daily lists. I'm using one address as an example, but the lists contain upwards of 10,000 addresses that have to be cross-matched. I don't need them flagged or highlighted, just deleted from the daily lists (though if they have to be flagged or highlighted, I could work with that). Any help here would be much appreciated.
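
    One spreadsheet-only approach is a helper column that joins the address parts, plus a COUNTIF against the master list; a sketch assuming the parts sit in columns A-C on both sheets and the master sheet is named "Master" (both names are illustrative):

        ' helper column D, filled down on both the master and daily sheets:
        =A2 & "|" & B2 & "|" & C2
        ' flag column E on the daily sheet:
        =IF(COUNTIF(Master!D:D, D2) > 0, "DUPLICATE", "")

    Filling both formulas down, autofiltering column E for "DUPLICATE", and deleting the visible rows removes the matches; COUNTIF over a whole column stays workable at the 10,000-row scale described.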

    Read the article

  • Why is there still so much offer for Perl programmers?

    - by user491444
    A quick search on monster.com for different scripting languages showed Perl having many more job opportunities than Python and Ruby (in Europe; I didn't check the rest of the world), and since I'm just a newbie programmer I was wondering why this is. I've read everywhere that Python and Ruby are much better languages, and much more organized. Having coded in Python and PHP myself, Perl's code seems alien to me. Anyway, sorry for my poor English, it's my second language, and this is not a critique of the Perl language; I was just wondering whether it's a good idea to learn it at this point or not.

    Read the article

  • C library build system dependencies

    - by Ninefingers
    Hello all, this debate has cropped up on a mailing list for a project I'm involved in. Unfortunately we're quite a small bunch at the moment, so I want to ask a wider audience. We're writing a C library (for arbitrary-precision arithmetic) and are investigating build systems. Currently we have a bash script in desperate need of work. I believe we can't use autotools etc. due to licensing (BSD vs GPL), so I suggested we use a modern scripting language like Python or Perl. The question is: is having something like Perl or Python around at build time an unrealistic dependency on Unix-like platforms these days?

    Read the article

< Previous Page | 33 34 35 36 37 38 39 40 41 42 43 44  | Next Page >