Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

Page 222/389 | < Previous Page | 218 219 220 221 222 223 224 225 226 227 228 229  | Next Page >

  • How to recover organic position in Google results after server down?

    - by ElHaix
    I have several sites that were doing quite well in terms of organic SEO rankings, and I have the important ones set up in Google's Webmaster Tools. Long story short, the server was down for about two weeks. Now, in AdSense and Analytics, I am seeing that the page views are SLOWLY increasing, and I would like to know if there is anything I can do now to expedite the process of regaining those positions. Since that server was returning errors for two weeks, is it possible that Google will now rank any site from that IP address lower because of them? Or is this something that I just have to let ride out? Thanks.

    Read the article

  • Rails backend: debugging [closed]

    - by banditKing
    I have a Rails API app that uses RABL. I'm trying to build a photo-sharing app, and I'm getting status 500 codes when my client communicates with the server. The client is an iOS app I wrote. Where should I begin the debugging process, and what are the best tools for debugging Rails API backend apps? I'm new to server development, so I'm trying to learn the tricks of the trade. Any help would be appreciated. Thanks.

    Read the article

  • Google Analytics not tracking data correctly: IP address issue?

    - by PaperThick
    I have developed a small site for a client, and the site has been placed inside an <iframe> on the client's site. The GA script I'm using looks like this:

        <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(
            ['_setAccount', 'UA-XXXXXXXX-2'],           // my company's GA account
            ['_trackPageview'],
            ['b._setAccount', 'UA-XXXXXXXX-1'],         // test GA account
            ['b._trackPageview'],
            ['th._setAccount', 'UA-XXXXXXX-3'],
            ['th._setDomainName', '.clientdomain.se'],  // client GA account
            ['th._trackPageview']
        );
        (function () {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
        })();
        </script>

    As you can see, I report the GA pageviews to the client as well, and the script is tracking visitors and pageviews at both ends. The problem is that on my client's side the visitor count is more than double what it is on my end (20,000 vs. 5,000). At first I thought the data was being duplicated at some point, but when I checked my Crazy Egg account I saw it had tracked over 10,000 visits and then stopped tracking because that was the limit on my account. The page my site is on is served from an IP address (http://XXX.XXX.XX.X/campaign/) and not from a "valid" URL. Could that be a reason why some of the visitors aren't being tracked? Thanks in advance.

    Read the article

  • Length of Page Title, URL, Meta Description and total number of links on a page

    - by MJWadmin
    We've been examining a number of different SEO tools recently. Several of these tell us that some of our page titles, URLs and meta descriptions are too long. We've also been told that some of our pages have too many links on them. I guess our first question is: is any of that feedback true? Can URLs and the rest actually be too long, and if so, how much does this affect ranking? Secondly, can you have too many links on a page, and if so, how many is too many? Thanks in advance...
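    For anyone wanting to enforce limits like these at the template level, here is a minimal PHP sketch that trims titles and meta descriptions to commonly cited display lengths (roughly 60 and 155 characters). Those cutoffs, and the $rawTitle/$rawDescription variables, are assumptions for illustration, not documented search engine limits.

        <?php
        // Trim a page title or meta description at a word boundary. The
        // 60/155 character cutoffs below are commonly cited display limits,
        // not official search engine rules.
        function trimToLimit($text, $limit) {
            if (strlen($text) <= $limit) {
                return $text;
            }
            $cut = substr($text, 0, $limit);
            $pos = strrpos($cut, ' ');
            return ($pos !== false ? substr($cut, 0, $pos) : $cut) . '...';
        }

        $title       = trimToLimit($rawTitle, 60);        // $rawTitle is hypothetical
        $description = trimToLimit($rawDescription, 155); // so is $rawDescription

        echo '<title>' . htmlspecialchars($title) . '</title>';
        echo '<meta name="description" content="' . htmlspecialchars($description) . '">';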

    Read the article

  • Should I add rel nofollow to internal links which already have meta noindex?

    - by CamSpy
    Let's say I have a products page that lists products and is paginated. I would like the first page to carry all the search engine ranking weight, so I decided to put a meta noindex on the rest of the paginated pages (from page 2 to N). My common sense says that if I don't want those pages indexed, I also shouldn't pass link/PR juice to them. (Is that smart?) What happens if I set rel="nofollow" on all pagination links from page 2 to page N?
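    For reference, this is a minimal PHP sketch of the setup being described: a noindex meta tag on every page past the first, and nofollow on the links pointing to them. The $page/$totalPages variables and the buildPageUrl() helper are hypothetical.

        <?php
        // $page, $totalPages and buildPageUrl() are hypothetical helpers.
        if ($page > 1) {
            // Pages 2..N: keep them out of the index but let crawlers
            // still follow links out of them.
            echo '<meta name="robots" content="noindex,follow">';
        }

        // Pagination links, with rel="nofollow" on everything past page 1,
        // as the question proposes.
        for ($n = 1; $n <= $totalPages; $n++) {
            $rel = ($n > 1) ? ' rel="nofollow"' : '';
            echo '<a href="' . htmlspecialchars(buildPageUrl($n)) . '"' . $rel . '>' . $n . '</a> ';
        }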

    Read the article

  • Most common Apache and PHP configuration for portable Web Applications

    - by Mahan
    I always create web applications using PHP, but I distribute and deploy my work to many different server platforms and web server configurations, so I constantly run into deployment problems because some features are enabled on one host and disabled on another. My question: is there a standard web server configuration, commonly used by most web servers worldwide, that covers the aspects of reliability, security and maintainability?
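    There may not be one standard configuration, but a defensive tactic is to verify the environment at application startup so a misconfigured host fails loudly instead of mid-request. A minimal sketch; the extension and setting names are examples of what an app might require.

        <?php
        // Fail fast on hosts missing what the application needs.
        // The required extensions listed here are just examples.
        $required = array('pdo_mysql', 'mbstring', 'json');

        foreach ($required as $ext) {
            if (!extension_loaded($ext)) {
                die("Missing required PHP extension: {$ext}");
            }
        }

        // Check ini settings the app depends on.
        if (!ini_get('file_uploads')) {
            die('file_uploads must be enabled in php.ini');
        }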

    Read the article

  • E-Commerce Website

    - by haargott
    I am planning to create an e-commerce website where users can buy products and services. I want users to register and also participate in something like a browser game, in which every user receives questions to answer. For each question they answer successfully they receive points, and the number of collected points decides their rank. Edit 2: Currently I am considering using only HTML, CSS, JavaScript, PHP and SQL to build this e-commerce website. Alongside this I was thinking about learning jQuery, as it may help me, but I am not sure whether I should code everything myself or use the library to work faster. 1) Could you tell me whether those languages are sufficient for creating the website described? 2) Could you tell me what kinds of free software tools and frameworks are most appropriate for creating this e-commerce website?
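    Those languages are a workable base for the points mechanic described. As a rough illustration, here is a minimal PHP/SQL sketch of awarding points and computing a rank; the PDO connection and the users table schema are assumptions.

        <?php
        // Award points for a correctly answered question and return the
        // user's rank. Assumes a PDO connection ($pdo) and a `users`
        // table with `id` and `points` columns -- a hypothetical schema.
        function awardPoints(PDO $pdo, $userId, $points) {
            $stmt = $pdo->prepare('UPDATE users SET points = points + :p WHERE id = :id');
            $stmt->execute(array(':p' => $points, ':id' => $userId));

            // Rank = 1 + number of users with strictly more points.
            $stmt = $pdo->prepare(
                'SELECT 1 + COUNT(*) FROM users
                  WHERE points > (SELECT points FROM users WHERE id = :id)');
            $stmt->execute(array(':id' => $userId));
            return (int) $stmt->fetchColumn();
        }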

    Read the article

  • IIS and content caching

    - by JayC
    I'm a web developer and the administrator of a Windows 2008 R2 cloud instance running IIS 7. I recently made an update to our website, but when I revisited it, the site was displayed with the old styling. I did a hard refresh (Shift + reload in Firefox) and of course the site then displayed as it should. I didn't worry about it until my client had the same issue in Safari. So my question, in general, is: how do I prevent this from happening again while still allowing some caching of our site? I noticed we did not have content expiration set up on our web server sites, so I've now set that up, but did I really need to? I've also looked at ETags and, honestly, it's hard for me to know whether I should use them or not. One comment I read somewhere said there aren't really any problem scenarios with ETags in IIS (even in web farms)... but I don't know. Does anybody have any suggestions, links or info? Thanks.

    Read the article

  • How do I get this to work? [closed]

    - by user1867842
    This is my code; it speaks for itself:

        <?php
        define("html", "<html>");
        define("htmlEnd", "</html>");
        // etc... etc...
        ?>

    What I'm trying to do is make a wrapper for HTML's tags so they won't be needed anymore, but I can't get any of the attributes of HTML elements to be defined in PHP. This again speaks for itself; I don't know any other way of saying it. I guess what I'm asking is: how would I make another markup language like HTML, without any tags, while still keeping everything about HTML?
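    Constants created with define() are fixed strings, so they can never carry per-element attributes; a function that generates tags can. A minimal sketch of that alternative (the helper's name and API are made up for illustration):

        <?php
        // A constant cannot take parameters, but a function can build
        // any tag with arbitrary attributes. Hypothetical helper:
        function tag($name, array $attrs = array(), $content = '') {
            $html = '<' . $name;
            foreach ($attrs as $key => $value) {
                $html .= ' ' . $key . '="' . htmlspecialchars($value) . '"';
            }
            return $html . '>' . $content . '</' . $name . '>';
        }

        // Usage:
        echo tag('a', array('href' => 'http://example.com'), 'a link');
        // prints: <a href="http://example.com">a link</a>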

    Read the article

  • Recommendation for a webshop with an API

    - by m.sr
    I'm searching for a webshop. The catch is that the webshop software of my choice needs to have a usable API or some interface for external applications; e.g. I need to place orders from an external application, or fetch product descriptions and warehouse stock into it. I would like a webshop where the web interface is just one way to interact with the whole system. There are some other requirements which have to be fulfilled, but I guess they are fairly common:

    - runs on Linux
    - MySQL (we already have MySQL replication and backup in place)
    - I like open source, but I'm willing to pay for it if it's worth it

    I found some webshops on the net, but perhaps you can tell me whether there's any hope of a webshop with a good API before I go and test all of them; at first glance I didn't find any documentation about an interface for external applications for any of my search results. Thank you!

    Read the article

  • Will we be penalized for having multiple external links to the same site?

    - by merk
    There seem to be conflicting answers to this question, and the most relevant ones are at least a year or two old, so I thought it would be worth re-asking. My gut says it's OK, because there are plenty of sites out there that do this already. Every major retailer site usually has links to the manufacturer of whatever item they are selling; go to www.newegg.com and they have hundreds of links to the same site, since they sell multiple items from the same brand. Our site allows people to list a specific genre of items for sale (not porn; I'm just keeping it generic since I'm not trying to advertise), and on each item's listing page we include a link back to the seller's website if they want one. Our SEO guy is saying this is really bad and that Google is going to treat us as a link farm. My gut says that when we have to start removing features that are useful to our users in order to boost our ranking, or start jumping through hoops like hiding text using JavaScript, something is wrong. Some clients are only selling one to a handful of items, while a couple of our bigger clients have hundreds of items listed, so they will have hundreds of pages linking back to their site. I should also mention that with the bigger clients there will be a handful of pages that may appear to be duplicates, because they will be selling two or three of the same item and the only difference in page content might be a stock #. The majority of the pages, though, will have unique content. So: will we be penalized in some way for having anywhere from a handful to a few hundred pages that all point to the same link? If we are penalized, what's the suggested way to handle this? We still want to give users the option to go to the client's site, and we would still like the link back to help the client's own SE rankings.

    Read the article

  • How to use a database to generate multiple folder content pages? [migrated]

    - by VenomVipes
    Scenario: I am trying to build a mobile entertainment portal that will enable users to download music and movies to their cell phones. Example of the problem: Suppose I upload 100 folders of songs, where each folder holds one album. I want a way to generate a page containing all the folder names (album names). If users click an album on that page, they should be taken to a page listing all the songs in the album, and clicking any song name should let them download it. Can this be done somehow, or will I have to design each of the three pages for every album by hand? If I do that, it's time-consuming and will also make it difficult to change anything like the footer or header...
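    This is exactly what a database-driven template avoids: one script can serve both the album list and each album's song list. A minimal sketch, assuming a PDO connection and hypothetical albums/songs tables; a shared header and footer can then be include()d once, so site-wide changes happen in a single file.

        <?php
        // One template serves every album. Assumes a PDO connection
        // ($pdo) and hypothetical `albums` (id, name) and `songs`
        // (id, album_id, title, file_path) tables.
        include 'header.php'; // hypothetical shared header

        if (!isset($_GET['album'])) {
            // Page 1: list all albums, linking back to this script.
            foreach ($pdo->query('SELECT id, name FROM albums') as $album) {
                echo '<a href="?album=' . (int) $album['id'] . '">'
                   . htmlspecialchars($album['name']) . '</a><br>';
            }
        } else {
            // Page 2: list the songs of the chosen album as download links.
            $stmt = $pdo->prepare('SELECT title, file_path FROM songs WHERE album_id = ?');
            $stmt->execute(array((int) $_GET['album']));
            foreach ($stmt as $song) {
                echo '<a href="' . htmlspecialchars($song['file_path']) . '">'
                   . htmlspecialchars($song['title']) . '</a><br>';
            }
        }

        include 'footer.php'; // hypothetical shared footer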

    Read the article

  • Django - need to split a table across multiple locations [closed]

    - by MikeRand
    Hi all, I have a Django project to track our company's restructuring projects. Here's the very simple model:

        class Project(models.Model):
            code = models.CharField(max_length=30)
            description = models.CharField(max_length=60)

        class Employee(models.Model):
            project = models.ForeignKey(Project)
            employee_id = models.IntegerField()
            country_code = models.CharField(max_length=3)
            severance = models.IntegerField()

    Due to regulations in some European countries, I'm not allowed to keep employee-level severance information in a database that sits on a box outside of that country. In Django, how do I manage the need to have my Employee table split across multiple databases based on an Employee attribute (i.e. country_code), in a way that doesn't impact anything else in the project (e.g. views, templates, admin)? Thanks, Mike

    Read the article

  • IIS to parse PHP in .dll files

    - by Agony
    These .dll files aren't dynamic link libraries; the file name is what the client-side software requests (and it cannot be changed). Each is essentially a PHP script that should run and return specific values. Currently, however, IIS simply serves the file as a download, which results in a failure. This is what the script returns on an Apache server:

        [Update]
        NewVersion=1
        UpdateFileNumber=1
        UpdateFile1=update1/LPServerInfo.dat
        ServerNumber=1
        Server1=http://88.159.116.217/

    and here it is on IIS: 198.24.133.74:8080/update.dll?0. Renaming the file to .php works fine for testing; it runs and returns values. I edited the MIME types and set .dll to application/x-httpd-php, but that doesn't seem to work in IIS. Any solutions?

    Read the article

  • Tuning web server response

    - by Vedran Wex Maricevic
    I have this same question on Stack Overflow, and I was advised to ask it here in the hope of more information. Here is the question: I am in a rather unfavorable situation. I have the AspDotNetStorefront e-commerce application and a search add-on called VibeTrib, and I don't have the source code for either of them. The store that runs on Storefront and VibeTrib has close to 250k products, and we also have lots of filters. I spoke to the VibeTrib reps, and they want extra money to optimize the queries they use. The money they require is not a big deal, but the problem is I don't trust them anymore; what we got is much different from what is being advertised. To cut the long story short: I am running the store on Amazon AWS now, and regardless of what DB server (MS SQL 2012) I set up (I tried 32 GB RAM monster instances), it is slow. The Ajax search uses full-text search and displays search keywords relatively fast, but once the search is performed (to display all results) it is still slow! Is there something I can do on my own end to accelerate the speed? I do have full control over the EC2 instance (Windows Server 2012 and IIS 8). Can I set IIS to step in for the search and cache some of it? I was hoping to cache at least some of the most common words. My best bet is IIS 8 :) Is there any help in my case? Thanks.

    Read the article

  • Comments Application SEO

    - by user1015448
    I am developing a commenting application that users will be able to integrate into their blogs. I am unsure how to make the comments searchable by search engines: what I want is for all posted comments to be included in search engine results when someone searches with relevant keywords. Please give me a hint how to do this. Do I need to use meta tags? If so, how should I create them?
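    Meta tags won't do much here; crawlers index what is in the HTML the server actually sends. So the usual approach is to render the comments server-side rather than injecting them with JavaScript after the page loads. A minimal sketch, where $comments is a hypothetical array fetched from the database:

        <?php
        // Render comments into the served HTML so crawlers can index
        // them. $comments is a hypothetical array of rows fetched earlier.
        foreach ($comments as $comment) {
            echo '<div class="comment">';
            echo '<p>' . htmlspecialchars($comment['text']) . '</p>';
            echo '<span>by ' . htmlspecialchars($comment['author']) . '</span>';
            echo '</div>';
        }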

    Read the article

  • Why do spammers use CELESTRON NEXTAR 6SE?

    - by fmz
    I am running a website for a volunteer organization that hosts an annual event. There is a form where people can volunteer to bring items for the event. All too frequently I get spam from users across the globe who enter things like this:

        Country - 1: Australia
        Material - 1: CELESTRON NEXTAR 6SE
        Country - 2: Australia
        Material - 2: C8 Newton
        Country - 3: Australia
        Material - 3: ETX 125EC
        Country - 4: Australia
        Material - 4: ETX 125EC
        Country - 5: Australia
        Material - 5: CELESTRON NEXTAR 6SE

    I don't really care about the country, but what is it with the telescope stuff? Is there some hidden meaning behind it all, or is it some astronomy group that moonlights as spammers?

    Read the article

  • What is the best managed VPS Hosting as far as Performance, Cost, and Customer service? [closed]

    - by Scotty
    Possible Duplicate: How to find web hosting that meets my requirements?

    I'm currently using InMotion Hosting, which is great in all of the categories listed in this question's title except for cost. I'm on a tight budget and am looking for something a little more affordable that still has great performance and customer service. I prefer Linux, and an affiliate program would also be a huge plus. Any recommendations?

    Read the article

  • Re-Route Mail to a port other than 25

    - by Ken
    Is there a way to route mail to a port other than 25? I have an email account attached to my laptop that I'd like to be able to send and receive mail from. Because I'm mobile, I'll be passing through various networks that will probably block this port. My dynamic DNS provider allows me to use web forwards for MX domains; is it possible to forward to a domain:port managed by my DNS provider as I move between networks? If not, is there another way? Of course I could use webmail or relay forwarding from my home server, but that's not geeky enough.

    Read the article

  • SEO - Hidden content before main site content

    - by 0pt1m1z3
    I have two hidden divs before my main site content, one with the login form and another with the signup form, plus login and signup buttons within the page that use jQuery to show or hide the divs. I like the effect this setup offers: the forms drop down from the top of the page and push the rest of the content down. However, recently I have been getting serious about SEO, and I am wondering whether these divs have been affecting my SERP rankings. Basically, every non-logged-in page (everything bots see) has the same two display:none; divs at the top of the document flow. Is that bad? Should I re-engineer these forms and the way they are displayed?

    Read the article

  • How can I have more clicks than page views in AdSense?

    - by ArcticLlama
    One of my AdSense ad units (in the new beta interface) occasionally reports more clicks than page views, which gives a CTR of over 100%. Does anyone know how this happens? I assume it has something to do with when a page view is recorded versus when someone clicks, but it happens regularly enough (on a daily report) that it can't just be a bunch of users clicking an ad before the page displays fully.

    Read the article

  • Combining a content management system with ASP.NET

    - by Ek0nomik
    I am going to be creating a site that seems to require a blend of a content management system (CMS) and some custom web development (which is done in ASP.NET MVC). I have plenty of web development experience to understand the ASP.NET MVC side of the fence, but I don't have a lot of CMS knowledge beyond getting one stood up. Right now my biggest question is about integrating ASP.NET security with the CMS. I currently have an ASP.NET MVC site that handles the authentication for multiple production sites and creates an authentication cookie under our domain (*.example.com). The page acts like a single sign-on page, since the cookie is a wildcard and can be used by any other application on the same domain. I'd really like to avoid having users put in their credentials twice. Is there a CMS that will play well with ASP.NET Forms Authentication, given how I have these existing applications structured? As an aside, right now I am leaning towards Drupal, but that isn't finalized.

    Read the article

  • Website Remodel Redirects

    - by inKit
    We've recently built a site for a new client who has not moved all the content from their old site into the new one. A lot of the content is also dynamic, with IDs that don't match between the old site and the new one. We have added dynamic redirects for most of the URL patterns we could find among the pages that were 404ing, but there are still a lot of pages, some with old content and some with just jumbled URLs, that we cannot match up with content pages on the new site. Is it better to redirect these leftover pages to the homepage, or to leave them 404ing?
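    Whatever policy is chosen, the mechanics are simple. Below is a minimal PHP sketch of a redirect map for leftovers that do have a known destination, with a genuine 404 for everything else; the map entries and the 404.php include are illustrative assumptions.

        <?php
        // Map known old URLs to their new homes with permanent (301)
        // redirects. The entries below are illustrative examples only.
        $redirects = array(
            '/old-about.html'    => '/about',
            '/products.php?id=7' => '/products/widget',
        );

        $requested = $_SERVER['REQUEST_URI'];

        if (isset($redirects[$requested])) {
            header('Location: ' . $redirects[$requested], true, 301);
            exit;
        }

        // No match: send a real 404 rather than redirecting to the
        // homepage, so search engines can drop the dead URLs.
        http_response_code(404);
        include '404.php'; // hypothetical custom error page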

    Read the article

  • Google Analytics: understanding dimensions and metrics?

    - by flossfan
    If I run a query against the Google Analytics API and set the dimension to ga:pagePathLevel1 and the metric to ga:avgTimeOnPage, I get results like this:

        { pagePathLevel1: /about, avgTimeOnPage: 28 },
        { pagePathLevel1: /contact, avgTimeOnPage: 10 }

    I'm not completely sure how to interpret this. Is the value of avgTimeOnPage the average time spent by a user across all pages that match that path, or is 28 seconds the average time spent on any single page matching the path? I'm looking for the average time spent across all pages matching the path, but the time estimates look shorter than I'd expect. I hope that question makes sense! Please tell me if it doesn't.

    Read the article

  • Bayesian content filter for vbulletin [on hold]

    - by mc0e
    I've been tasked with coming up with a tool that automatically flags some posts for moderator attention on a large vbulletin forum. It's not spam per se, but the task has a lot in common with the sort of handling a spam protection plugin (a "mod" in vbulletin speak) might do. There's only so much I can say, but the task does not involve bad users so much as particular kinds of posts which the moderators need to be aware of. Filtering on user registrations and links is therefore not useful; we are talking about posts by real human users. What I'm looking for is an existing Bayesian classification plugin, or something I can study to understand how to build the vbulletin side of the interface for such a thing. I.e., I'd need ways for moderators to list flagged posts and to correct the classification of posts that have been misclassified. Ideally I want a three-way split, with an "unsure" category, in order to reduce what has to be reviewed to find any misclassifications. Any pointers? I've searched around a bit, and so far what I've found is more or less entirely targeted at intervening in sign-ups (mostly using StopForumSpam), CAPTCHAs, and external services like Akismet that are spam-specific. I'm also considering an external solution that might be able to be interfaced...
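    For the classification core itself (separate from any vbulletin integration), a naive Bayes scorer with a log-likelihood threshold gives the three-way split described. A minimal sketch; the word-count arrays are assumed to be maintained elsewhere, e.g. updated whenever a moderator corrects a classification.

        <?php
        // Minimal naive Bayes text scorer with an "unsure" band.
        // $counts holds per-class word counts, maintained elsewhere:
        // array('flag' => array('word' => n, ...), 'ok' => array(...)).
        function tokenize($text) {
            preg_match_all('/[a-z0-9]{3,}/', strtolower($text), $m);
            return $m[0];
        }

        function classify($post, array $counts, $threshold = 1.0) {
            $vocab  = count($counts['flag']) + count($counts['ok']) + 1;
            $totals = array(
                'flag' => array_sum($counts['flag']),
                'ok'   => array_sum($counts['ok']),
            );

            // Sum of log-likelihood ratios: positive favors 'flag'.
            $score = 0.0;
            foreach (tokenize($post) as $word) {
                foreach (array('flag' => 1, 'ok' => -1) as $class => $sign) {
                    // +1 is Laplace smoothing for unseen words.
                    $n = (isset($counts[$class][$word]) ? $counts[$class][$word] : 0) + 1;
                    $score += $sign * log($n / ($totals[$class] + $vocab));
                }
            }

            // Three-way split: the "unsure" band is what moderators review.
            if ($score > $threshold)  { return 'flag'; }
            if ($score < -$threshold) { return 'ok'; }
            return 'unsure';
        }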

    Read the article
