Search Results

Search found 24376 results on 976 pages for 'site crawler'.


  • Search ranking for important keywords has gone down drastically [duplicate]

    - by Vaivhav
    This question already has an answer here: How to diagnose a search engine ranking drop? (5 answers)

    First, we are a small entrepreneurial team of three, and I am an amateur webmaster for the company's website, as we cannot really afford a technical person or department right now. A few weeks ago, our website traffic and rankings for most keywords dropped overnight. I have done a lot of reading since then and learned about Penguin 2.1, which people say is the reason for the drop. Nothing like this had ever happened before.

    I have gone through the entire Google Webmaster help section. It says that if a manual penalty is taken against us, we will see a message on the Manual Actions page; so far, we haven't received any notice from Google for web spam. Some SEO people I contacted said they found spam links in our backlink profile. I do believe I mistakenly purchased a cheap link/SEO scheme when I was still very new to SEO. That was more than a year ago, and since then we have been legitimate. Moreover, how do I find out which links are spam and which are not?

    Our content is all original, fresh, and the best you will find in our niche. We also have a blog, but on a different domain (wordpress.com), from which we send anchored links to our business website. Is this a good thing to do?

    Now, how should we proceed to recover our traffic and rankings? I tried searching in Webmaster Tools for a way to reach Google and ask why the traffic decreased suddenly, but I couldn't find a contact form or anything like it. Can someone please go through our website and help clarify the reason for the drop, along with a solution? I would really appreciate it, as I can't figure this out and it's taking a lot of time. Vaivhav

    Read the article

  • Website signaled as containing malware

    - by Bakaburg
    I've got a nasty problem with one of our websites: Google and other services have flagged it as containing malware, and we haven't been able to figure out how to deal with the problem. Could anyone point us in the right direction?

    UPDATE: I used Google Webmaster Tools to request a review of the suspicious website, and now it says the site is OK, even though I didn't change anything! How can that be? Was it a false alarm?

    Read the article

  • Is Azure Compatible with JPEG XR?

    - by Shawn Eary
    I just put an F#/MVC app into a Windows Azure solution as a web role. Before migration, my JPEG XR (*.WDP) files displayed on the client in IE9 without issue, via both my local and hosted sites. Now, after migration into Windows Azure, my JPEG XR files are not displayed in my local Windows Azure compute emulator, nor are they displayed when deployed to http://*.cloudapp.net. Is there some sort of conflict between Windows Azure and (JPEG XR) *.wdp files? If so, what is the accepted best practice for overcoming this conflict?
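
    A common cause here is a missing MIME mapping: IIS (which backs both the Azure web role and the compute emulator) only serves static files whose extensions have a registered MIME type, and .wdp may not have one by default. A minimal web.config sketch, assuming that is the cause:

        <configuration>
          <system.webServer>
            <staticContent>
              <!-- Assumption: register the JPEG XR MIME type so IIS will serve .wdp files -->
              <mimeMap fileExtension=".wdp" mimeType="image/vnd.ms-photo" />
            </staticContent>
          </system.webServer>
        </configuration>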

    Read the article

  • Should users be deleted after inactivity on a website?

    - by Hovaness Bartamian
    When you have a social website, or any website where users can register, would you eventually delete inactive accounts after a certain time (say, a year of inactivity), or would you rather keep their account records forever? I know websites like Facebook carry large numbers of inactive, duplicate and fake accounts. So I'm wondering whether, after two years of inactivity, it would be all right to send the account holder a warning email about deletion unless they log in. I'm just thinking of clean and efficient database management, and of any implications this may have for potential new users.
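
    As a minimal sketch of the warn-then-delete policy described above, in SQL Server, assuming a hypothetical users table with lastLoginDate and warningSentDate columns:

        -- Accounts inactive for two years that have not yet been warned
        SELECT userId, email
        FROM users
        WHERE lastLoginDate < DATEADD(year, -2, GETDATE())
          AND warningSentDate IS NULL;

        -- Accounts still inactive 30 days after the warning went out
        DELETE FROM users
        WHERE lastLoginDate < DATEADD(year, -2, GETDATE())
          AND warningSentDate < DATEADD(day, -30, GETDATE());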

    Read the article

  • Self-imposed lockout from program

    - by Alex
    I'm plagued with a lack of willpower. I recently started looking for solutions and came across a program for Macs called SelfControl, which completely blocks one's access to a given set of websites for a given period of time (you can delete the program, restart your computer, or do almost anything, and it will still block those sites for the specified period, and it doesn't require a password to do it). Unfortunately, there is no Windows analogue. The one that comes closest is Cold Turkey: you set a time in the future, specify a list of websites (or programs, e.g. Explorer, Firefox, Chrome), and you are blocked from accessing them for the whole duration; no password can undo it, no system reboot, etc. The problem is that the program is buggy, and to ensure you're never locked out of websites forever, it ships an uninstaller that is just an exe file accessible at any time, which completely defeats the purpose of a self-imposed lockout.

    I want to make a better version of that program, or find a simple way to prevent access to a given set of programs over a given period of time with no way around it. I've only taken a few introductory courses in Java (I'm a math major), but the internet is having a real negative effect on my studies, and the only way I can get work done is to eliminate all distractions. What do I need to learn in order to make a program with the following properties?

    - Given a set of .exe files and a time in the future, the program prevents access to those .exe files until the current time reaches the given time.
    - Restarting the computer doesn't interfere with the program.
    - The program cannot be uninstalled until the given time.
    - One can't create another instance of the program to block the program itself.

    I don't care how much programming knowledge I need to acquire to make this, so please give me a specific list of things to study in order to make it happen, or if something like this already exists, please let me know.
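
    As a starting point only, here is a minimal C# sketch of the core watchdog loop, which repeatedly kills the listed processes until the deadline passes. The process names and duration are placeholders, and it deliberately ignores the hard parts listed above: surviving a reboot would require running it as a Windows service or scheduled task, and blocking uninstallation would require locking down file permissions.

        using System;
        using System.Diagnostics;
        using System.Threading;

        class SelfLockout
        {
            static void Main()
            {
                // Hypothetical block list and deadline
                string[] blocked = { "firefox", "chrome", "iexplore" };
                DateTime until = DateTime.Now.AddHours(2);

                while (DateTime.Now < until)
                {
                    foreach (string name in blocked)
                    {
                        // Kill every running instance of each blocked program
                        foreach (Process p in Process.GetProcessesByName(name))
                        {
                            try { p.Kill(); }
                            catch (Exception) { /* the process may have already exited */ }
                        }
                    }
                    Thread.Sleep(1000); // re-check once per second
                }
            }
        }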

    Read the article

  • Deploying my first website!

    - by test
    I have built a data-driven website: an ASP.NET site using the Entity Framework. My solution contains four projects: the web application (PresentationLayer) and three class libraries (Data Layer, Business Layer and Common Layer). In one of these libraries, Common Layer, I have my model (MyModel.edmx). I have always tested my application on Cassini, the ASP.NET Development Server; I have never touched IIS in my life. I bought a domain and hosting on GoDaddy. My instinct is to grab my four folders (one for each layer) and simply move them to the root folder. But I know that's wrong: first, the home page would then be mywebsite.org/presentationlayer/default.aspx, and second, I start getting a bunch of errors where files do not load or are not found. I also know that I need to manage the web.config, but I have no experience with it and don't know where to start. I'm not sure if this is a problem, but I also have a web service included in my presentation layer.
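
    For what it's worth: when the class libraries are referenced by the web project, publishing (or building) the web project copies their compiled DLLs into its bin folder, so only the web application's own output needs to go to the host's root, not four separate folders. The part of web.config that usually must be edited by hand is the Entity Framework connection string, which has to point at the host's database server. A sketch with placeholder values (the entity container name and server details below are assumptions):

        <connectionStrings>
          <!-- Placeholder values; copy the metadata paths from your development web.config -->
          <add name="MyModelEntities"
               connectionString="metadata=res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=your-sql-server;Initial Catalog=your-database;User ID=your-user;Password=your-password&quot;"
               providerName="System.Data.EntityClient" />
        </connectionStrings>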

    Read the article

  • JavaScript-Only Search Method [on hold]

    - by user2118228
    I need to put a search function on a website that is going to ship on a CD-ROM with no access to the internet. It has 80 pages and about 500 "items", so I'd prefer not to hard-code hundreds of if statements if possible. I've found a few programs you can buy that will index the content and generate results (Zoom Search, JSS Index, The German Guys'), but there are odd quirks with each one. Besides, I would rather code it myself to have complete control over it and to really understand what it's doing. Basically, searching for a few words should display the product image and description; clicking on that would take you to the related URL. I can't find an easy solution that doesn't involve hundreds of if statements. Has anyone ever created anything like this, or does anyone know a better method? I've used PHP/MySQL for search results before, but this site cannot run any PHP.

    Read the article

  • How to grep (or find) on cPanel?

    - by San
    How can I search for a specific string (a function name or a variable name) in my files, which sit in various directories under the cPanel file manager? I have a library directory, and functions in that directory are used in various apps and pages. Now I need to change something in a library file, and I need to know the impact on the files that use its functions. How can I search/find/grep through the hosted files?
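
    If the host allows SSH access (cPanel's file manager itself has no grep), the usual approach is a recursive grep over the document root; the function name and paths here are placeholders:

        # List every file under the library directory that mentions the function
        grep -rln 'my_function_name' ~/public_html/library/

        # Show matching lines with file names and line numbers across the whole site
        grep -rn 'my_function_name' ~/public_html/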

    Read the article

  • What are the best measures to protect content from being crawled?

    - by Moak
    I've been crawling a lot of websites for content recently, and I'm surprised that no site so far has been able to put up much resistance. Ideally, the site I'm working on should not be so easy to harvest. So I was wondering: what are the best methods to stop bots from harvesting your web content? Obvious solutions: robots.txt (yeah, right) and IP blacklists. What can be done to catch bot activity? What can be done to make data extraction difficult? What can be done to give bots crap data? Just looking for ideas; there's no right/wrong answer.
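
    One concrete example of catching bot activity: an ASP.NET HttpModule that throttles any client exceeding a request-rate threshold. This is a sketch, not a hardened design; the threshold, window and use of the ASP.NET cache are assumptions, and determined scrapers can rotate IPs around it.

        using System;
        using System.Threading;
        using System.Web;
        using System.Web.Caching;

        // Hypothetical throttle: rejects an IP that makes more than 60 requests per minute
        public class ThrottleModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.BeginRequest += OnBeginRequest;
            }

            private void OnBeginRequest(object sender, EventArgs e)
            {
                HttpApplication app = (HttpApplication)sender;
                string key = "throttle_" + app.Request.UserHostAddress;

                // A one-element array is cached by reference, so incrementing it
                // preserves the entry's original one-minute expiration.
                int[] counter = (int[])app.Context.Cache[key];
                if (counter == null)
                {
                    app.Context.Cache.Insert(key, new int[] { 1 }, null,
                        DateTime.UtcNow.AddMinutes(1), Cache.NoSlidingExpiration);
                }
                else if (Interlocked.Increment(ref counter[0]) > 60)
                {
                    app.Response.StatusCode = 429; // Too Many Requests
                    app.Response.End();
                }
            }

            public void Dispose() { }
        }

    Registering the module under system.webServer/modules in web.config wires it into the request pipeline.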

    Read the article

  • FBA site owner encounters access denied in SharePoint 2007

    - by intangible02
    I created a SharePoint 2007 publishing site, first using Windows authentication, then extended it to another site using FBA. I created an FBA user and set it as site collection admin as well as top site owner. I also set the application pool that the FBA site runs in to run under a user account that is in the Administrators group. But I encounter access denied errors when browsing certain links with this site owner account. Are there other settings I need to configure? I found that in the web.config, impersonation is set to true. How does this affect the access rights?

    Read the article

  • How to Add a File from my source tree to Maven Site

    - by Charles O.
    I have a Maven 2 RESTful application using Jersey/JAXB. I generate the JAXB beans from a schema file, where the schema file is in my resources directory, e.g., src/main/resources/foo.xsd. I want to include foo.xsd file in the generated Maven site for my project, so that clients can see the XML schema when writing RESTful calls. How can I include foo.xsd in the site? I could have a copy of the file in src/main/site/..., and then update my site.xml to point to it (or have a .apt whose contents point to it), but I don't like that because I'm still tweaking foo.xsd, and don't want to have to remember to copy it each time I update it. And that's just bad practice. I also tried having a .apt file that has a link to the foo.xsd which gets copied to the target/classes directory. That works until I do a site:deploy, because that only copies the target/site directory. Thanks, Charles
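
    One way to avoid the manual copy, assuming the standard maven-resources-plugin is acceptable here: bind a copy-resources execution to the pre-site phase, so the current foo.xsd is copied into the generated site on every site build (and therefore travels with site:deploy), and link to xsd/foo.xsd from the .apt page. A sketch for the build/plugins section of the POM:

        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-resources-plugin</artifactId>
          <executions>
            <execution>
              <id>copy-xsd-to-site</id>
              <phase>pre-site</phase>
              <goals>
                <goal>copy-resources</goal>
              </goals>
              <configuration>
                <outputDirectory>${project.build.directory}/site/xsd</outputDirectory>
                <resources>
                  <resource>
                    <directory>src/main/resources</directory>
                    <includes>
                      <include>foo.xsd</include>
                    </includes>
                  </resource>
                </resources>
              </configuration>
            </execution>
          </executions>
        </plugin>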

    Read the article

  • Create list in existing site collection from a feature

    - by keysersoze
    I have created a feature, a publishing site, in Visual Studio for MOSS. The feature contains a master page, some page templates, some site columns (grouped to match each page template) and some custom list templates, etc. I have also created a site collection, some sites and pages based on my feature. Now I have upgraded the code in my feature: I want a ListInstance to be created based on my custom list template. After upgrading the solution in SharePoint (using WSPBuilder), the ListInstance and default data are visible if I create a new site collection, but existing site collections do not get the ListInstance and data. Is there anything I can do so that existing site collections get the ListInstance when upgrading?
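
    One common workaround, sketched here with hypothetical list and template names: provision the list from a feature receiver on activation, so existing site collections pick it up when the feature is deactivated and reactivated (for example with stsadm -o activatefeature). This assumes a Web-scoped feature; for a Site-scoped feature, Feature.Parent is an SPSite instead.

        using System;
        using Microsoft.SharePoint;

        public class ListProvisioningReceiver : SPFeatureReceiver
        {
            public override void FeatureActivated(SPFeatureReceiverProperties properties)
            {
                SPWeb web = properties.Feature.Parent as SPWeb;
                if (web == null) return;

                // Only create the list if it does not already exist
                foreach (SPList list in web.Lists)
                {
                    if (list.Title == "My Custom List") return;
                }

                SPListTemplate template = web.ListTemplates["My Custom List Template"];
                web.Lists.Add("My Custom List", "Provisioned by feature receiver", template);
            }

            // WSS 3.0 declares these as abstract, so they must be overridden even if empty
            public override void FeatureDeactivating(SPFeatureReceiverProperties properties) { }
            public override void FeatureInstalled(SPFeatureReceiverProperties properties) { }
            public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { }
        }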

    Read the article

  • Why can't a .NET ASMX web service on secure.site.com be called from www.site.com?

    - by user118657
    Hello,

    We have a web service at https://secure.site.com/service.asmx. It works fine from https://secure.site.com/consumer.html, but when we try to use it from https://www.site.com/consumer.html we can't: we get a 403 error. It's probably something related to web service security (because of the different subdomains), but I can't figure out what. How can I make https://secure.site.com/service.asmx accessible from https://www.site.com/consumer.html?

    Update: We are calling the web service using jQuery AJAX:

        $.ajax({
            type: "POST",
            url: "https://secure.site.com/service.asmx/method",
            data: {},
            dataType: "xml",
            success: method_result,
            error: AjaxFailed
        });

    Thanks.
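
    The browser's same-origin policy treats secure.site.com and www.site.com as different origins, so the cross-subdomain XMLHttpRequest is refused. One possible fix, assuming IIS7 and CORS-capable browsers, is to have the secure host send an Access-Control-Allow-Origin header (the classic alternative for older browsers is a small server-side proxy page on www.site.com that forwards the call):

        <system.webServer>
          <httpProtocol>
            <customHeaders>
              <!-- Allow pages served from www.site.com to call this host -->
              <add name="Access-Control-Allow-Origin" value="https://www.site.com" />
            </customHeaders>
          </httpProtocol>
        </system.webServer>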

    Read the article

  • SharePoint: How to obtain the current site/web/list properly

    - by driAn
    Hi all,

    What is the best way to obtain the current site/web/list?

    Option 1 - Reusing existing objects:

        SPSite site = SPContext.Current.Site;
        SPWeb web = SPContext.Current.Web;
        SPList list = SPContext.Current.List;

    Option 2 - Creating new objects:

        SPSite site = new SPSite(SPContext.Current.Site.ID); // dispose me
        SPWeb web = site.OpenWeb(SPContext.Current.Web.ID); // dispose me
        SPList list = web.Lists[SPContext.Current.List.ID];

    I experienced problems when using option 1 in some situations. Since then I have chosen the second option, and it has worked fine so far. What is your opinion on this? Is it generally better to go with option 2? Any other suggestions?

    Read the article

  • Keeping an ASP.NET MVC site on IIS6 always ready to accept requests

    - by Andrew Florko
    I have an ASP.NET MVC intranet site deployed to IIS6. The site is used rarely, so the app pool tends to shut down. When a user clicks a page for the very first time, 5-10 seconds pass before the page appears (the app pool starts and the site is compiled). The situation repeats for the next page, and so on. AFAIK IIS7 has an option to disable app pool shutdown, but IIS6 lacks it. Currently I have a special utility that periodically pings the site (10 pages) to determine whether the pages are available, and in this way keeps the site always ready for users. Is this normal, or have I missed something in the IIS6 configuration? Do you use such pinger apps in production to notify support/admins if the site is not available? Thank you in advance!
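
    Such pingers are a normal workaround on IIS6; the built-in alternative is to uncheck "Shut down worker processes after being idle" on the app pool's Performance tab. A minimal C# sketch of such a keep-alive utility, with placeholder URLs:

        using System;
        using System.Net;
        using System.Threading;

        class KeepAlive
        {
            static void Main()
            {
                // Hypothetical page list; use the pages you want kept warm
                string[] urls =
                {
                    "http://intranet/site/page1.aspx",
                    "http://intranet/site/page2.aspx"
                };

                using (WebClient client = new WebClient())
                {
                    while (true)
                    {
                        foreach (string url in urls)
                        {
                            try { client.DownloadString(url); }
                            catch (WebException ex) { Console.WriteLine("{0}: {1}", url, ex.Message); }
                        }
                        Thread.Sleep(TimeSpan.FromMinutes(10)); // ping before the idle timeout fires
                    }
                }
            }
        }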

    Read the article

  • One CodeIgniter controller named 'site' needs to handle multiple domains

    - by Mauricio Webtailor
    I have a controller in CodeIgniter that handles different subsites: site/index/1 fetches content for subsite A, and site/index/2 fetches content for subsite B. Now we have registered domain names for these subsites, so what we need is:

    - http://www.subsite1.com - the default controller should be site/index/1, without site/index/1 in the URI
    - http://www.subsite2.com - the default controller should be site/index/2, without site/index/2 in the URI

    I fiddled and tried to play with routes.php but I'm getting nowhere. Can somebody point me in the right direction?

    Read the article

  • WordPress > microsites use main site's menu (same domain, multiple subdirectories, multiple WP installs)

    - by Scott B
    I have a main site at site.com and several "microsites" at site1.site.com, site2.site.com, etc. These are all on the same server. Each site is set up in its own folder under public_html, and each has its own separate WordPress install. I'd like each microsite to share the same top-level menu (the pages menu) with the main site. I'm sure there are several approaches, and I'd like to ask you for a few ideas. As an aside, I'd also like to ask whether the new WordPress 3.0 beta would make this simpler, since it merges WordPress MU into the main WordPress core.

    Read the article

  • Problems with viewing site in Internet Exploder

    - by Kevin
    I built a site and I'm just about finished. It displays properly in all the browsers I have (Safari, Chrome and Firefox), but my client is not computer-savvy at all and still uses Internet Explorer, so that's all he's using to view the site. I don't have IE to test with, so I've been using BrowserStack.com, and I see that in IE the site is broken: the navigation bar has a white background and is pushed down a line, and the logo isn't appearing. Could anybody please help me figure out why the site isn't displaying properly in IE, and how to fix it? Help is greatly appreciated. Thanks. Site: WebuildCAhomes dot com

    Read the article

  • Include weather information in an ASP.NET site from weather.com services

    - by sreejukg
    In this article, I am going to demonstrate how you can use the XMLOAP services (referred to as XOAP from here onwards) provided by weather.com to display weather information on your website. The XOAP services are available free of charge, provided you comply with the requirements from weather.com. I am writing this article from a technical point of view; if you are planning to use the weather.com XOAP services in your application, please refer to the terms and conditions on the weather.com website.

    In order to start using the XOAP services, you need to sign up for the XOAP data feed. The sign-up process is simple: browse to http://www.weather.com/services/xmloap.html, click the sign-up button, and you will reach the registration page. Here you need to specify the site name you want to use this feed for. Once you fill in all the mandatory information, click the Save and Continue button. That's it, the registration is over. You will receive an email that contains your partner ID, license key and SDK. The SDK, available as a zipped file, contains the terms of use and documentation about the available services. It also includes the logos and icons required to display the weather information.

    As per the SDK, there are currently two types of information available through XOAP:

    - Current conditions for over 30,000 U.S. and over 7,900 international location IDs, updated at least hourly
    - Five-day forecast (today + 4 additional forecast days in consecutive order beginning with tomorrow) for over 30,000 U.S. and over 7,900 international location IDs, updated at least three times daily

    The SDK provides detailed information about the fields included in the response of each service. Additionally, there is a refresh rate that you need to comply with. As per the SDK: "Refresh Rate" shall mean the maximum frequency with which you may call the XML Feed for a given LocID requesting a data set for that LocID. During the time period in between refresh periods the data must be cached by you either in the memory on your servers or in Your Desktop Application.

    About the Services

    Weather.com provides access to the XML feed over the Internet through the hostname xoap.weather.com. The weather data must be requested for a specific location, so you need a location ID (LOC ID). The XML feed works with two types of location IDs: city identifiers and 5-digit US postal codes. If you do not know your location ID, don't worry; there is a location ID search service available to retrieve it from a city name. Since I am a resident of the Kingdom of Bahrain, I am going to retrieve the weather information for Manama (the capital of Bahrain). To get the location ID for Manama, type the following URL in your address bar:

    http://xoap.weather.com/search/search?where=manama

    I got the following XML output:

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- This document is intended only for use by authorized licensees of The -->
        <!-- Weather Channel. Unauthorized use is prohibited. Copyright 1995-2011, -->
        <!-- The Weather Channel Interactive, Inc. All Rights Reserved. -->
        <search ver="3.0">
          <loc id="BAXX0001" type="1">Al Manama, Bahrain</loc>
        </search>

    You can try this with any city name: if the city is available, the service returns the location ID; otherwise it returns nothing.
    In order to get the weather information from XOAP, you need to pass certain parameters to the XOAP service. A brief description of the parameters follows; please refer to the SDK for more details.

    - cc: Optional; if you include this, the current conditions will be returned. The value can be anything, as it will be ignored (e.g. cc=*).
    - dayf: If you want the forecast for 5 days, specify dayf=5. Optional.
    - link: The value should be xoap.
    - par: Your partner ID. You can find this in your registration email from weather.com.
    - prod: The value should be xoap.
    - key: The license key assigned to you. This is also in the registration email.
    - unit: s or m (standard or metric; think Fahrenheit/Celsius). Optional; if not specified, the unit will be standard (s).

    The URL host for the XOAP service is http://xoap.weather.com. So for my purpose, I need to make the following request (replace the ***** with the corresponding values):

    http://xoap.weather.com/weather/local/BAXX0001?cc=*&link=xoap&prod=xoap&par=*********&key=**************

    The response XML has a root element "weather". Under the root element are the following sections:

    - <head>: metadata about the returned weather results.
    - <loc>: the location data block, describing the location for which the weather data was retrieved.
    - <lnks>: the four promotional links you need to place along with the weather display. In addition to these four links, there should be another link, using the Weather Channel logo, to the weather.com home page.
    - <cc>: the current conditions. This element is present only if you specified cc in the request.
    - <dayf>: the forecast data. This element is present only if you specified dayf in the request.

    In this walkthrough, I am going to capture the weather information for Manama (location ID: BAXX0001). You need two applications to display the weather on your website:

    1. A console application that retrieves data from XOAP and stores it in a SQL Server database (or any data store you prefer). This application will be scheduled to execute every 25 minutes using the Windows Task Scheduler, so that we comply with the refresh rate.
    2. A web application that displays the data from the SQL Server database.

    Retrieve the Weather from XOAP

    I created a console application named WeatherService, and a SQL Server database table with the following columns (I named the table tblweather; you are free to choose any name):

    - lastUpdated: datetime; the last time the weather data was updated, i.e. the time the service ran.
    - TemparatureDateTime: the date and time returned by the XML feed.
    - Temparature: the temperature returned by the XML feed.
    - TemparatureUnit: the unit of the temperature returned by the XML feed.
    - iconId: the ID of the icon to be used. Currently 48 icons, numbered 0 to 47, are available.
    - WeatherDescription: the weather description phrase returned by the feed.
    - Link1url, Link1Text ... Link4url, Link4Text: the URL and the text for each of the four promo links.

    Every time the service runs, the application updates these database columns from the XOAP data feed.
    When the application starts, it gets the data as XML from that URL. This demonstration uses LINQ to extract the necessary data from the fetched XML. The following code segment extracts the data from the weather XML using LINQ:

        // First, create an instance of the XDocument class with the XOAP URL.
        // Replace **** with the corresponding values.
        XDocument weather = XDocument.Load("http://xoap.weather.com/weather/local/BAXX0001?cc=*&link=xoap&prod=xoap&par=***********&key=c*********");

        // Construct a query using LINQ.
        var feedResult = from item in weather.Descendants()
                         select new
                         {
                             unit = item.Element("head").Element("ut").Value,
                             temp = item.Element("cc").Element("tmp").Value,
                             tempDate = item.Element("cc").Element("lsup").Value,
                             iconId = item.Element("cc").Element("icon").Value,
                             description = item.Element("cc").Element("t").Value,
                             links = from link in item.Elements("lnks").Elements("link")
                                     select new
                                     {
                                         url = link.Element("l").Value,
                                         text = link.Element("t").Value
                                     }
                         };

        // Load the root node to a variable; you may use a foreach construct instead.
        var item1 = feedResult.First();

    If you want to learn more about LINQ and XML, read this nice blog post from Scott Gu: http://weblogs.asp.net/scottgu/archive/2007/08/07/using-linq-to-xml-and-how-to-build-a-custom-rss-feed-reader-with-it.aspx

    Now you have all the required values in item1. For example, if you want the temperature, use item1.temp. Next, execute an SQL query against the database; see the connection part:

        using (SqlConnection conn = new SqlConnection(@"Data Source=sreeju\sqlexpress;Initial Catalog=Sample;Integrated Security=True"))
        {
            string strSql = @"update tblweather set lastupdated=getdate(),
                temparatureDateTime = @temparatureDateTime, temparature=@temparature,
                temparatureUnit=@temparatureUnit, iconId = @iconId, description=@description,
                link1url=@link1url, link1text=@link1text, link2url=@link2url, link2text=@link2text,
                link3url=@link3url, link3text=@link3text, link4url=@link4url, link4text=@link4text";
            SqlCommand comm = new SqlCommand(strSql, conn);
            comm.Parameters.AddWithValue("temparatureDateTime", item1.tempDate);
            comm.Parameters.AddWithValue("temparature", item1.temp);
            comm.Parameters.AddWithValue("temparatureUnit", item1.unit);
            comm.Parameters.AddWithValue("description", item1.description);
            comm.Parameters.AddWithValue("iconId", item1.iconId);
            var lstLinks = item1.links;
            comm.Parameters.AddWithValue("link1url", lstLinks.ElementAt(0).url);
            comm.Parameters.AddWithValue("link1text", lstLinks.ElementAt(0).text);
            comm.Parameters.AddWithValue("link2url", lstLinks.ElementAt(1).url);
            comm.Parameters.AddWithValue("link2text", lstLinks.ElementAt(1).text);
            comm.Parameters.AddWithValue("link3url", lstLinks.ElementAt(2).url);
            comm.Parameters.AddWithValue("link3text", lstLinks.ElementAt(2).text);
            comm.Parameters.AddWithValue("link4url", lstLinks.ElementAt(3).url);
            comm.Parameters.AddWithValue("link4text", lstLinks.ElementAt(3).text);
            conn.Open();
            comm.ExecuteNonQuery();
            conn.Close();
            Console.WriteLine("database updated");
        }

    Now press Ctrl+F5 to run the service, then check your database and make sure the data has been updated with the latest information from the service. (Make sure you insert one row in the database manually before executing the service; otherwise you need to modify the application code to count the rows and conditionally perform an insert or update query.)

    Display the Weather Information in an ASP.NET Page

    Now you have all the data in the database.
    You just need to create a web application that displays the data from the database. I created a new ASP.NET web application with a default.aspx page. In order to comply with the terms of weather.com, you need to use the weather.com logo along with the weather display. You can find the necessary logos in the "logos" folder of the SDK. Additionally, copy one of the icon sets from the "icons" folder to your web application; I used the 93x93 icon set, but you are free to use any of the other available sizes.

    The page contains a heading, an image control (for displaying the weather icon), two label controls (for displaying the temperature and the weather description), four hyperlinks (for displaying the four promo links returned by the XOAP service) and the weather.com logo with a hyperlink to the weather.com home page. The following code updates the values of these controls from the values stored in the database by the service application in the previous step. Go to the code-behind file for the web page and enter it in the Page_Load event handler:

        using (SqlConnection conn = new SqlConnection(@"Data Source=sreeju\sqlexpress;Initial Catalog=Sample;Integrated Security=True"))
        {
            SqlCommand comm = new SqlCommand("select top 1 * from tblweather", conn);
            conn.Open();
            SqlDataReader reader = comm.ExecuteReader();
            if (reader.Read())
            {
                lblTemparature.Text = reader["temparature"].ToString() + "&deg;" + reader["temparatureUnit"].ToString();
                lblWeatherDescription.Text = reader["description"].ToString();
                imgWeather.ImageUrl = "icons/" + reader["iconId"].ToString() + ".png";
                lnk1.Text = reader["link1text"].ToString();
                lnk1.NavigateUrl = reader["link1url"].ToString();
                lnk2.Text = reader["link2text"].ToString();
                lnk2.NavigateUrl = reader["link2url"].ToString();
                lnk3.Text = reader["link3text"].ToString();
                lnk3.NavigateUrl = reader["link3url"].ToString();
                lnk4.Text = reader["link4text"].ToString();
                lnk4.NavigateUrl = reader["link4url"].ToString();
            }
            conn.Close();
        }

    Press Ctrl+F5 to run the page, and you will see the weather displayed. That's it. You now need to configure the console application to run every 25 minutes so that the database stays up to date. You can also fetch the forecast information, store it in the database, and retrieve it later in your web page. Since the data resides in your database, you have full control over the display. Just make sure your website complies with the weather.com license requirements. If you want the source code of this walkthrough, post your email address below. Hope you enjoy the reading.
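
    As a sketch of the scheduling step mentioned in the article, a Windows scheduled task can run the console application every 25 minutes; the task name and path below are assumptions, so adjust them to your setup:

        schtasks /Create /SC MINUTE /MO 25 /TN "WeatherServiceUpdate" /TR "C:\apps\WeatherService.exe"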

    Read the article

  • Is there a search engine that indexes the source code of a web page?

    - by Dexter
    I need to search the web for sites in our industry that use the same AdWords management company, to ensure that said company is not violating our contract, as it has been accused of doing. They use a tracking code in the template of every page which has a certain domain in the URL, and I'm wondering if it's possible to "Google" the source code using some bot that crawls the code rather than the content. For example, I bought an unlimited license for an image gallery, and I was asked to type the license number in a comment just before the script. I thought it was just so a human could look at the source and find out whether someone had paid, but it turned out that the vendor actually had a crawler looking for their source code and that comment. If it ran across the code on your site, it would look for the comment, and if it found one, it would check whether it was an existing license. If not, it would first notify you of your noncompliance, and then notify the owner of the script. Edit: I'm looking to index HTML and JavaScript only, not the server-side languages or Java.

    Read the article

  • Alternatives to Web of Trust

    - by user23950
    Are there any alternatives to Web of Trust for Chrome and Firefox? I ask because I found out that WOT doesn't always ask you whether you want to access a dangerous site. While browsing a while ago for a curriculum vitae template, I saw an image on Google that looked like one. I clicked it, but it brought me to a site with a red mark in WOT, and WOT didn't even bother to warn me first that the site was dangerous. Do you know of any alternatives?

    Read the article

  • VMware vSphere cluster design for site redundancy

    - by Stefan Radovanovici
    I have a question about the best design for site redundancy when using vSphere clusters. First, a bit of background about our situation. We are a medium-sized company with two main offices, located in different countries. Our networks are linked by a Layer 2 150Mbps leased line which is currently underused. We have a variety of services running for internal use within the company, some on physical servers and some on existing vSphere clusters. In our department we also run several services (almost all under various forms of Linux) like NTP, syslog, jump servers, monitoring servers and so on. We now have the requirement that those servers be redundant within each location (which they are not at the moment) and also site-redundant (which they are to some extent; the servers are duplicated in the second location, with configurations kept in sync via various methods at the application layer).

    There is no SAN available for us, at least not something we can use at the moment. Cost is also an issue: while we do have some budget available for this, we can't afford to buy SANs for both locations, for example. I looked at the VSA feature and it seems it could be something for us, but I am unsure how to solve the site-redundancy requirement.

    For testing purposes I am setting up vSphere 5 with VSA on two ESXi hosts in a lab. I am currently using the Essentials Plus kit with a VSA license, which allows me to build a VSA cluster on up to 3 hosts, together with a vCenter license to manage them. The hosts each have two dual-port network cards and two 600GB drives running in RAID 1. Hardware-wise this will be enough for us to run all the services we need as VMs, and it will provide redundancy within the site.

    At the moment I see only two options for site redundancy:

    1. Build an identical VSA cluster in the second location and keep the various services synced at the application layer (database sync, rsync and so on).
    2. Simply move one of the hosts from the existing cluster to the second location, basically having the VSA cluster span the 150Mbps link between the sites.

    I would very much prefer the second option, but I am unsure how well it would work, if it can work at all. Technically it should: we can span the needed VLANs across the leased line and have them available in the second location. The advantage would be that we don't need to worry at all about syncing databases and the like. But I have the feeling that the bandwidth will not be enough. I have no way of knowing how much traffic the VSA cluster will generate between the hosts; I realize this will most likely depend on the individual usage of the VMs, but still, I have no idea how VSA replicates data between the ESXi hosts.

    Are these my only options, or can my goals be achieved in some other way? Is there perhaps a way to have some sort of "cold standby" cluster in the second location where the VMs would be synced once per night from the main location? The idea is that if the first site becomes unavailable, we would be able to bring all those VMs online there. We would be OK with the data being one day old.

    Any answers are appreciated.

    Best regards,
    Stefan

    Read the article

  • Running ASP.NET 3.5 and ASP.NET 2.0 in the same site

    - by cori
    We're running ASP.NET 2.0 on our corporate web site, and I'd like to get it up to ASP.NET 3.5 as smoothly as possible. The project/solution architecture in VS 2005 is an ASP.NET 2.0 web project and a .NET 2.0 data access layer project which is used by the site code. Upon opening the projects in a new VS 2008 solution, they seemed to be converted to .NET 3.5 with a minimum of fuss: they built correctly out of the box, deployed successfully, and seem to work just fine, which is exactly what I would expect given that .NET 2.0 and 3.5 share a common runtime. The major difference after the conversion is that the web.config file's referenced DLLs are now the 3.5 versions.

    What I would like to do is update the site piecemeal: as I modify a given page, I send the 3.5 version of that page over to our web server rather than updating the whole site at once. In testing on our dev box this approach seems to be working fine. The site code is interacting with the .NET 3.5 data access layer without difficulty, a handful of pages are running 3.5 code-behind (by this I mean they're running assemblies built in VS 2008; the site uses single-page assemblies for code-behind), the 3.5 web.config is in place, and the bulk of the site is running code-behind assemblies built in VS 2005. Everything looks great. Which makes me worried that I'm missing something. Is this architecture workable, or is there a problem lying in wait for me that I haven't considered?

    Read the article

  • Cannot access a very specific site from my router

    - by DJDarkViper
    This is a problem for me because this site is important to me: it's MY website, and sadly my email is hosted on it, so I can't access that either. These days, when I try to access my website while connected to my Linksys E3000 router, the connection simply doesn't go through. When I ping it, it's all "Request timed out", and when I tracert:

        C:\Users\Kyle>tracert blackjaguarstudios.com

        Tracing route to blackjaguarstudios.com [199.188.204.228]
        over a maximum of 30 hops:

          1    <1 ms    <1 ms    <1 ms  CISCO26565 [192.168.1.1]
          2    16 ms    15 ms    11 ms  11.4.64.1
          3    11 ms     9 ms    11 ms  rd1cs-ge1-2-1.ok.shawcable.net [64.59.169.2]
          4    20 ms    21 ms    22 ms  66.163.76.98
          5    37 ms    36 ms    35 ms  rc1nr-tge0-9-2-0.wp.shawcable.net [66.163.77.54]
          6   112 ms    84 ms    85 ms  rc2ch-pos9-0.il.shawcable.net [66.163.76.174]
          7    86 ms    89 ms    90 ms  rc4as-ge12-0-0.vx.shawcable.net [66.163.64.46]
          8    90 ms    84 ms    85 ms  eqix.xe-3-3-0.cr2.iad1.us.nlayer.net [206.223.115.61]
          9    97 ms    97 ms    99 ms  xe-3-3-0.cr1.atl1.us.nlayer.net [69.22.142.105]
         10   128 ms   128 ms   126 ms  ae1-40g.ar1.atl1.us.nlayer.net [69.31.135.130]
         11   101 ms    97 ms    96 ms  as16626.xe-2-0-5-102.ar1.atl1.us.nlayer.net [69.31.135.46]
         12   100 ms    97 ms   197 ms  6509-sc1.abstractdns.com [207.210.114.166]
         13     *        *        *     Request timed out.
         (hops 14 through 30 also time out)

        Trace complete.

    Shaw Cable is my ISP. Figuring this had something to do with a setting I made on the router, I reset the thing back to factory defaults. Nope. So I'm at a bit of a loss what to do here. NO device (computers, laptops, tablets, phones, PS3/360, etc.) can access my site or its features, so it's not just my computer; every other site is just fine. When I connect to my neighbor's router, the site comes up just fine, and she's with Shaw as well. What should I do?!

    Read the article
