Search Results


  • Will the new site keep Google traffic from the old site when moving its content? [closed]

    - by user1324762
    Possible Duplicate: new domain, old links are 301'd from old domain to new, how will this affect my rankings? I have a site about bikers, and now I have created a dating site for bikers. I don't need the old site any more; I want to move all of its articles to the new dating site. So this is not just moving content to a new domain, but to an entirely new site. What I am planning to do is set up 301 redirects for all 200 articles. For pages that are not articles, I will just put up a message saying the site will be down soon. Do you think I will get all the Google traffic those articles currently bring to the old site? Is there anything I should be aware of or careful about?

    Read the article
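
    A minimal .htaccess sketch of the move described above, assuming the articles live under a common /articles/ path and keep their slugs on the new domain (both names hypothetical):

        RewriteEngine On
        # send every old article to its twin on the new site
        RewriteRule ^articles/(.*)$ http://bikerdating.example/articles/$1 [R=301,L]
        # everything else shows the "closing soon" notice
        RewriteCond %{REQUEST_URI} !=/closing-soon.html
        RewriteRule !^articles/ /closing-soon.html [L]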

  • Serve up syntactic XHTML5 using the text/html MIME type?

    - by cboettig
    I have a site currently written with HTML5 tags. I'd like to be able to parse the site as XML, with support for namespaces, etc., to facilitate programmatic extraction of data. Currently I have <!DOCTYPE html> and <meta charset="utf-8">, which I gather is equivalent in HTML5 to explicitly setting the content type as <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />. In order to serve XML, it sounds like the right thing to do is <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml">. Should I also change my content type to <meta http-equiv="content-type" content="application/xhtml+xml; charset=iso-8859-1" />, or is that not necessary? What is the advantage of having the content type be "application/xhtml+xml"? What is the disadvantage? (It sounds like it may break Internet Explorer's rendering of the site, but maybe that information is out of date now?) Many thanks!

    Read the article
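
    Serving real application/xhtml+xml has to happen in the HTTP header, not in a <meta> tag (in XML mode the meta element is ignored for this purpose). A minimal PHP sketch of content negotiation, assuming you want to keep text/html as a fallback for older browsers such as IE 8:

        <?php
        // send the XML MIME type only to clients that claim to accept it
        $accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';
        if (strpos($accept, 'application/xhtml+xml') !== false) {
            header('Content-Type: application/xhtml+xml; charset=utf-8');
        } else {
            header('Content-Type: text/html; charset=utf-8');
        }
        ?>
        <!DOCTYPE html>
        <html xmlns="http://www.w3.org/1999/xhtml">
        ...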

  • How should I set up billing for AdWords when managing a client's campaign in My Client Center? [closed]

    - by Dustin
    I have worked with Google AdWords before and will now be managing an AdWords account for a client. I have a My Client Center account, but I'm wondering what the best practices are for billing. Should I link billing to my own credit card and then have the client pay me (they have to pay me to manage the account anyway), or should I have the client pay Google directly? How is this usually done? If it is the latter, what is the best way to have them input their payment info?

    Read the article

  • Why do Google search results include pages disallowed in robots.txt?

    - by Ilmari Karonen
    I have some pages on my site that I want to keep search engines away from, so I disallowed them in my robots.txt file like this:

        User-Agent: *
        Disallow: /email

    Yet I recently noticed that Google still sometimes returns links to those pages in their search results. Why does this happen, and how can I stop it?

    Background: Several years ago, I made a simple web site for a club a relative of mine was involved in. They wanted to have e-mail links on their pages, so, to try and keep those e-mail addresses from ending up on too many spam lists, instead of using direct mailto: links I made those links point to a simple redirector / address-harvester trap script running on my own site. This script would return either a 301 redirect to the actual mailto: URL or, if it detected a suspicious access pattern, a page containing lots of random fake e-mail addresses and links to more such pages. To keep legitimate search bots away from the trap, I set up the robots.txt rule shown above, disallowing the entire space of both legit redirector links and trap pages.

    Just recently, however, one of the people in the club searched Google for their own name and was quite surprised when one of the results on the first page was a link to the redirector script, with a title consisting of their e-mail address followed by my name. Of course, they immediately e-mailed me and wanted to know how to get their address out of Google's index. I was quite surprised too, since I had no idea that Google would index such URLs at all, seemingly in violation of my robots.txt rule. I did manage to submit a removal request to Google, and it seems to have worked, but I'd like to know why and how Google is circumventing my robots.txt like that and how to make sure that none of the disallowed pages will show up in their search results.

    PS. I actually found out a possible explanation and solution, which I'll post below, while preparing this question, but I thought I'd ask it anyway in case someone else might have the same problem. Please do feel free to post your own answers. I'd also be interested in knowing if other search engines do this too, and whether the same solutions work for them.

    Read the article
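
    The usual explanation: robots.txt only forbids crawling, not indexing, so Google can still list a disallowed URL it discovers through external links, building the title from the link text. To keep a page out of the index it must be crawlable and carry a noindex signal. A sketch using Apache's mod_headers, path hypothetical:

        # httpd.conf: let bots fetch /email but tell them not to index it
        <Location "/email">
            Header set X-Robots-Tag "noindex"
        </Location>

    Note this only works together with removing the Disallow rule; otherwise the bot never sees the header.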

  • Flowmotion Running on Ubuntu Server [migrated]

    - by Thomas Egan
    I am trying to configure Flowmotion to work on my Ubuntu server. At present I use LAMP to serve pages from a VirtualBox installation. I will be moving this to a dedicated server, but I would like to enable true streaming of videos on this installation. I am only interested in open-source streaming for a research project, and although I have installed Flowmotion via apt-get, I don't know how to start the service so that embedded videos located on the server will stream. Can anybody provide any information on this, or online resources I may have missed? I have checked the documentation; however, it appears far too complex. Just to clarify, I'm running VirtualBox 4.2.1 on Mac OS X 10.6.8 and Ubuntu Server 12.06 64-bit.

    Read the article

  • Does Google see the output of document.write?

    - by merk
    I've got a site where people can list machinery for sale. Each item for sale has its own dynamic page. On each of these pages we allow the person selling the item to have a link back to their own website. Some people only sell a handful of items and some are selling dozens or hundreds, so in some cases we can have 100 links back to their external site. Our SEO guy is saying this is bad (I'll open another question on that). So I was wondering: if I take the links and spit them out using document.write, will that hide them from Google and the other search engines?

    Read the article
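
    For reference, the scheme in the question looks like the sketch below (hypothetical URL). Note that Google has been executing JavaScript for some time, so this is not a reliable way to hide links; rel="nofollow" is the sanctioned mechanism for links you don't want to endorse.

        <script type="text/javascript">
          // emit the outbound link from script instead of static HTML
          document.write('<a href="http://seller.example/" rel="nofollow">Visit the seller</a>');
        </script>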

  • PHP URL rewrite engine for a small project

    - by Jens Törnell
    I use PHP. I want to set up a micro site as a prototype, where I can work on the frontend only, separated from any CMS. URL rewriting: I also want the URLs to be rewritten nicely, like http://www.test.com/products/tables/green/little-wood123/ Questions: Is there any free class for URL rewriting? I searched but found none. If that is not the way to go, what framework would be nice for this? It should be tiny, easy to use, and support URL rewriting.

    Read the article
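
    A tiny front controller is usually enough for a prototype like this; a sketch, all names hypothetical. First, an .htaccess that funnels pretty URLs to one script:

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^ index.php [L]

    Then index.php splits the path itself:

        <?php
        // /products/tables/green/little-wood123/ -> ['products', 'tables', ...]
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        $segments = array_values(array_filter(explode('/', $path)));

        if (isset($segments[0]) && $segments[0] === 'products') {
            list(, $category, $color, $slug) = array_pad($segments, 4, null);
            // render the product template with $category, $color, $slug ...
        } else {
            // render the home page
        }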

  • Help with a PHP cURL script [closed]

    - by Sumeet Jain
    This script uses a cookie.txt in the same folder, chmoded to 777. The problem I am facing is that I have many accounts to log in to. Say I have 5 accounts: I created cookie1.txt, cookie2.txt and so on, and then the script worked with the POST data. But I want it to stay logged in while posting data. The code that works for login and posting data is at http://pastebin.com/zn3gfdF2 and the code I need should be something like http://pastebin.com/45bRENLN (I tried using the same cookie.txt for every account, but I guess it expires). Please help me with dealing with cookies, or suggest how to modify the code to work without cookie files.

    Read the article
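
    A sketch of the per-account fix: give every account its own cookie jar and reuse the handle, so the session cookie from the login is sent with the later POST (URLs and field names hypothetical):

        <?php
        function loginAndPost(array $account, array $postData) {
            $jar = __DIR__ . '/cookie_' . md5($account['user']) . '.txt';

            $ch = curl_init('http://site.example/login.php');
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // write cookies here
            curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // and read them back
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, array(
                'username' => $account['user'],
                'password' => $account['pass'],
            ));
            curl_exec($ch);  // session cookie now lives in $jar

            // same handle, same jar: this request is authenticated
            curl_setopt($ch, CURLOPT_URL, 'http://site.example/post.php');
            curl_setopt($ch, CURLOPT_POSTFIELDS, $postData);
            $result = curl_exec($ch);
            curl_close($ch);
            return $result;
        }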

  • How difficult is it to develop apps for Android and iOS? [on hold]

    - by netsetter
    I'm an experienced web developer (PHP, HTML, JavaScript, MySQL, CSS) and I run communities where people can register, log in, and do some stuff. Now more and more people are requesting an app. I told them I have no time or experience to develop apps with many functions for such complex communities, but then the users told me what would be enough for them, and that already sounds simpler: an app for Android / iOS that just lets them log in (plus auto-login), so they always appear online in the community when they have an internet connection; then a single function like a counter of new activity on their account (new messages, new replies, etc.); and if they tap the app, a browser window opens to read the full info on the main website. So what do you think: is it a big job to develop such an app for the members? Is there a big difference between developing for Android and for iOS? And how do you test the app if you don't have an Android or iOS phone, for example?

    Read the article

  • Paypal "Subscribe" button: Is it possible to let the subscriber set the amount?

    - by Šime Vidas
    I'm setting up a recurring payment option on my website. I'd like to have two options. Option 1 (for individuals): a fixed $6/mo subscription. Option 2 (for organizations): a subscription where the amount is set by the subscriber. PayPal's "Subscribe" button does not seem to allow that: when I leave the "Amount" field of the second option empty, I get an error. So, is this not possible? Do all options require fixed amounts?

    Read the article
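
    With PayPal Payments Standard the recurring amount (the a3 field) is a required part of a Subscribe button, which is why the empty field is rejected; an open, subscriber-chosen amount is not supported for subscriptions. A sketch of the fixed-amount variant for reference, seller address hypothetical:

        <form action="https://www.paypal.com/cgi-bin/webscr" method="post">
          <input type="hidden" name="cmd" value="_xclick-subscriptions">
          <input type="hidden" name="business" value="seller@example.com">
          <input type="hidden" name="item_name" value="Membership (individual)">
          <input type="hidden" name="a3" value="6.00"> <!-- amount: required -->
          <input type="hidden" name="p3" value="1">    <!-- every 1 ... -->
          <input type="hidden" name="t3" value="M">    <!-- ... month -->
          <input type="hidden" name="src" value="1">   <!-- make it recur -->
          <input type="submit" value="Subscribe">
        </form>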

  • Is there a way to forward emails associated with a domain without a mail server?

    - by MeltingDog
    A client owns example1.com but wants to also purchase example2.com and have it point to their original site at example1.com. No problem there. But they also want any emails going to example2.com to be forwarded to their counterparts at example1.com. E.g., if someone emails [email protected], it will be forwarded to [email protected]. The only way I can think of doing this at the moment is to set up hosting for example2.com and then set up mail forwarders in cPanel. But this seems a bit excessive and costly. Does anyone know another, cheaper way of doing this?

    Read the article
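
    Whatever forwarding service ends up doing the work (many registrars include one for free), the moving part is example2.com's MX record, which has to point at a host that accepts the mail and forwards it; no web hosting is involved. A sketch of the zone entry, forwarder host hypothetical:

        ; mail for example2.com is accepted by the forwarding service
        example2.com.   IN  MX  10  mx.forwarder.example.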

  • Mailing for categories

    - by nerkn
    There are more than 10 categories on my site, and users can subscribe to more than one category. My PHP script prepares the content of each category and then, according to user preference, merges that content for each user, so every user gets exactly the categories they want. The problem is that I need to send 340+ mails per hour and DreamHost doesn't allow that. What do you suggest? I'm thinking of a service like MailChimp, but I couldn't find this scenario covered: do they support category- and content-based merging, etc.? Can I use external SMTP from DreamHost?

    Read the article
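
    If an external SMTP relay is allowed, the merge-and-send loop can stay in PHP; a sketch assuming the PHPMailer 5.x library, with hypothetical data structures and credentials:

        <?php
        require 'PHPMailer/PHPMailerAutoload.php';

        foreach ($users as $user) {
            $body = '';
            foreach ($user['categories'] as $cat) {
                $body .= $contentByCategory[$cat];  // merge subscribed categories
            }

            $mail = new PHPMailer();
            $mail->isSMTP();
            $mail->Host     = 'smtp.relay.example'; // external relay, not DreamHost's
            $mail->SMTPAuth = true;
            $mail->Username = 'smtp-user';
            $mail->Password = 'smtp-pass';
            $mail->setFrom('news@example.com');
            $mail->addAddress($user['email']);
            $mail->Subject  = 'Your category digest';
            $mail->Body     = $body;
            $mail->send();
        }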

  • Visits, page views, bounce rate, new visitors, visit duration (Google Analytics): which one is the top priority for SEO?

    - by HOY
    This is the case: my site is getting a lot of traffic from an image (a company logo) because this image ranks first in Google's search results for that company's name (I have no idea how that happened). This image is a must for my website, but it is not relevant to the site content, so people searching for the image find my site, which gives me interesting statistics: http://postimage.org/image/3oyvrjoz9/ Pros: total visits and avg. new visits. Cons: avg. pages/visit, avg. visit duration, bounce rate. In summary, I am confused about whether this image is helpful to my website, because I don't know how to balance those five statistics. P.S. My website is 2 months old, and we are working on SEO at the moment. Another P.S.: I kindly ask you not to post assumptions; I have assumptions of my own, I need real knowledge. Edit: the search keyword is "arcelik logo", the search site is google.com.tr, and the search URL is https://www.google.com.tr/search?hl=en&q=arcelik+logo&bav=on.2,or.r_gc.r_pw.r_qf.&bvm=bv.41524429,d.Yms&biw=1366&bih=667&um=1&ie=UTF-8&tbm=isch&source=og&sa=N&tab=wi&ei=oZIDUfutAseVswa9zYHwCw

    Read the article

  • Why can Indian link builders or SEO companies make so many high-quality links at the same time? [closed]

    - by chiba
    There are a lot of Indian SEO companies and link builders that offer large numbers of high-quality links. Some of them, for example, offer links just from .co.uk domains, or from French sites, with high PageRank. I have heard that even SEO companies from other countries outsource link building to India. Do they have special connections for building links? Or do they exchange information with other Indian companies and maintain a big database of the sites where they can place links?

    Read the article

  • Are FatCow and iPage the real deal?

    - by Tribbey
    I need a cheap host that I can upgrade if needed for a startup. From searching, FatCow and iPage seem to be reliable and inexpensive web hosting services with a Unix OS and good bandwidth and disk space; I suspect they are owned by the same company. They claim to offer unlimited bandwidth/disk space and perks like AdSense/Facebook credit; however, their packages range from 1 to 3 years at 3.15 USD/mo. There isn't a monthly package, but they do allow you to cancel at any time, and their servers run on wind-generated energy, which is a plus. I was suspicious because I couldn't seem to find one negative review about them, just affiliate pages, until I read a review explaining their strict policies on copyrighted data. Does anyone have experience with either of these two hosts?

    Read the article

  • How to add SMS text messaging functionality to my website?

    - by jessegavin
    I want to add the ability to send reminders to people via email and SMS for specific events that they have signed up for on a web application that I am building. The email part is not difficult, but I am wondering where to find a good solution for sending SMS messages. It would also be a plus if this solution allowed two-way SMS communication with my web application so that people would be able to reply with a CONFIRM or CANCEL type of a message. Has anyone implemented something like this? Does anyone know of good tools out there? EDIT: I am realizing that this is more of a "lots of ways to skin this cat" type of question and so I changed it to community wiki.

    Read the article
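
    A sketch using one well-known SMS gateway (Twilio) from PHP; the REST endpoint and field names follow Twilio's public API, and the credentials and numbers are placeholders. Two-way messaging works the other way around: the gateway POSTs incoming replies (CONFIRM/CANCEL) to a webhook URL you configure.

        <?php
        $sid   = 'ACxxxxxxxxxxxxxxxx';  // account SID (placeholder)
        $token = 'your-auth-token';     // auth token  (placeholder)
        $url   = "https://api.twilio.com/2010-04-01/Accounts/$sid/Messages.json";

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_USERPWD, "$sid:$token");
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
            'From' => '+15005550006',
            'To'   => '+15551234567',
            'Body' => 'Reminder: your event starts at 9am. Reply CONFIRM or CANCEL.',
        )));
        $response = curl_exec($ch);  // JSON describing the queued message
        curl_close($ch);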

  • Tuning web server response

    - by Vedran Wex Maricevic
    I have this same question on Stack Overflow and I was advised to ask it here, hoping for more information. Here is the question: I am in a rather unfavorable situation. I have an AspDotNetStorefront e-commerce application and a search add-on called VibeTrib. I don't have source code for either of them. The store that runs on Storefront and VibeTrib has close to 250k products, and we also have lots of filters. I spoke to VibeTrib reps, and they want extra money to optimize the queries they use. The money they require is not a big deal, but the problem is that I don't trust them anymore; what we got is much different from what was advertised.

    To cut a long story short: I am running the store on Amazon AWS now, and regardless of what DB server (MS SQL Server 2012) I set up (I tried 32 GB RAM monster instances), it is slow. The Ajax search uses full-text search and displays search keywords relatively fast, but once the search is performed (to display all results) it is still slow! Is there something I could do on my own end to improve the speed? I do have full control over the EC2 instance (Windows Server 2012 and IIS 8). Can I set IIS to step in for the search and cache some of it? I was hoping to cache at least the most common words. My best bet is IIS 8 :) Is there any help for my case? Thanks

    Read the article
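
    One thing that is under your control on the web tier is IIS output caching, which can absorb repeated popular searches without touching the database. A sketch of a web.config fragment for IIS 8, with the duration and extension assumed:

        <configuration>
          <system.webServer>
            <caching>
              <profiles>
                <!-- cache rendered pages for 5 minutes, varying on the query string -->
                <add extension=".aspx" policy="CacheForTimePeriod"
                     duration="00:05:00" varyByQueryString="*" />
              </profiles>
            </caching>
          </system.webServer>
        </configuration>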

  • How can I redirect all files in a directory that don't conform to a certain filename structure?

    - by user18842
    I have a website where a previous developer updated several webpages. The issue is that the developer gave each new webpage a new filename and deleted the old ones. I've worked with .htaccess redirects for a few months now and have some understanding of their usage; however, I am stumped by this task. The old pages were named like www.domain.tld/subdir/file.html and the new pages are named www.domain.tld/subdir/file-new-name.html. The first word of each new filename is the exact name of the old file, and all new filenames share the same last two words:

        www.domain.tld/subdir/file1-new-name.html
        www.domain.tld/subdir/file2-new-name.html
        www.domain.tld/subdir/file3-new-name.html
        etc.

    We also need to be able to access the URL www.domain.tld/subdir/ itself. The new files have been indexed by Google (the old URLs cause 404s and need to be redirected to the new ones so that Google stays friendly), and the client wants to keep the new filenames, as they are more descriptive. I've attempted the redirect in many different ways without success, but I'll show the one that stumps me the most:

        RewriteBase /
        RewriteCond %{THE_REQUEST} !^subdir/.*\-new\-name\.html
        RewriteCond %{THE_REQUEST} !^subdir/$
        RewriteRule ^subdir/(.*)\.html$ http://www.domain.tld/subdir/$1\-new\-name\.html [R=301,NC]

    When visiting www.domain.tld/subdir/file1.html in the browser, this causes a 403 Forbidden error with a URL like www.domain.tld/subdir/file1-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name-new-name.html. I'm certain it's probably something simple that I'm overlooking; can someone please help me get a proper redirect? Thanks so much in advance! EDIT: I've also got all the old filenames saved in a separate document in case I need them, set up like the following example: (file(1|2|3|4|5)|page(1|2|3|4|5)|a(l(l|lowed|ter)|ccept)

    Read the article
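
    Two things go wrong above: %{THE_REQUEST} is the whole request line ("GET /subdir/file.html HTTP/1.1"), so a pattern anchored at ^subdir/ never matches and the guard conditions always pass; and nothing stops the rule from re-matching its own output, which is where the repeated -new-name suffix comes from. A sketch that guards on the URL itself:

        RewriteEngine On
        RewriteBase /
        # skip URLs that already carry the suffix, so the rule can't loop
        RewriteCond %{REQUEST_URI} !-new-name\.html$ [NC]
        RewriteRule ^subdir/([^/]+)\.html$ /subdir/$1-new-name.html [R=301,L,NC]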

  • Point domain to new host - changed nameservers, now what?

    - by Larry
    This is driving me nuts, because I know I'm missing something simple. I've read numerous articles/posts about how to point (not transfer) your domain to a new web host. They all say to change the name server settings at your old host, so here is what I did: on the old host (1and1.com) I changed the name server settings to those of my new host (inmotionhosting.com), like below:

        Domain name : mydomain.com
        Name server 1: ns.inmotionhosting.com
        Name server 2: ns2.inmotionhosting.com

    ... and confirmed this is active (did it a couple of days ago). This is where every post/article I've found stops. They imply this is all that needs to be done. But how does the new host know to point the domain to my account, and to the directory in my account I want it to work from? There's got to be something else to be done; just pointing to the generic name servers of the new host can't be all there is to it. Thanks in advance... I'm bewildered...

    Read the article
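
    The missing step happens on the new host's side: the domain has to be added in its control panel (as the primary or an addon domain) so it maps to a directory in your account. Once the delegation has propagated, you can verify the name server change from any machine with dig:

        $ dig +short NS mydomain.com
        ns.inmotionhosting.com.
        ns2.inmotionhosting.com.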

  • How to remove a page from a site without affecting Google SERPs

    - by Savas Zorlu
    I have a travel website. Just for information purposes, I had put up a weather page. Now I realize that this page is increasing my overall bounce rate, because people who are looking for the weather forecast land on that page, get what they want, and exit. What is the safest method to get rid of that page? Would it hurt my Google rank if I removed it completely? Or is there a better way to handle this situation? I realize that around 21 percent of my daily hits are on that page. I would have been happy if my aim were to provide weather data for the location; however, my site needs to focus on selling hotels, so I think I need to get rid of this weather page immediately. What do you think?

    Read the article
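
    If the decision is to drop the page, the cleanest signal to crawlers is a 410 Gone (a 301 to a relevant hotel page would instead keep some of the traffic). A one-line mod_alias sketch, path hypothetical:

        # tell crawlers the weather page is gone for good
        Redirect gone /weather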

  • How to figure out the recent PageRank of a website or a particular page (homepage)

    - by rajesh.magar
    This question comes up because the recently published algorithm changes by Google have affected my website's traffic. I've been wondering whether my homepage PageRank has also dropped from 6 to 4 (I am not sure). I am not using any specialized SEO tools like SEOmoz, Majestic SEO, etc., so it's quite difficult for me to tell whether the PageRank has really been affected or not. Can anyone please point to a good resource, tactic, or trick to address this question? Thanks!

    Read the article

  • 404 code/header for search engines on removed user content?

    - by mowgli
    I just got an email from a former user of my website. He was complaining that Google still shows the contact page he created on my site, even though he deleted it a month ago. This is the first time in many years anyone has requested this. I told him that it's almost entirely up to Google what content it wants to keep/show and for how long; if it's deleted on the site, I can't do much other than request a re-visit from the Googlebot. The user page already says something like "Not found. The user has removed the content." TL;DR: But the question is: should I generally send a 404 header (or another status) for dynamic user content that has been removed from the site? Or could this hurt the site (SEO)?

    Read the article
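
    Yes: the page should stop answering 200 once the content is gone; a 404 or, better, a 410 tells Google to drop it faster and does not hurt the rest of the site. A PHP sketch, flag name hypothetical:

        <?php
        if ($pageWasDeletedByUser) {
            header($_SERVER['SERVER_PROTOCOL'] . ' 410 Gone');
            echo 'Not found. The user has removed the content.';
            exit;
        }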

  • How will search rank be impacted if I move my mobile website to a single-page application?

    - by rahul
    I have two different versions of my site: a desktop version and a mobile-optimised version. That is, for the same URL the server renders different HTML for different user agents. I had been using the Vary header for this scheme, as recommended by Google. However, I now want to move the mobile website to a single-page application, for several reasons. I want to know: if Google stops seeing anything on my mobile web version, but the desktop version continues to work as it is, how will the search rank be impacted, given that the mobile web gets more traffic than the desktop version? And how does the Vary header come into play?

    Read the article
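
    For reference, the Vary-header scheme the question starts from looks like this sketch in PHP (file names hypothetical); whatever replaces the mobile version, keeping the header is what tells caches and crawlers that the same URL serves different payloads:

        <?php
        header('Vary: User-Agent');
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if (preg_match('/Mobile|Android|iPhone/i', $ua)) {
            include 'mobile-shell.html';  // the single-page application shell
        } else {
            include 'desktop.html';
        }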

  • How to build a list from Postfix maillog

    - by dstonek
    I want to build a list from maillog / maillog.x containing something like date, sender's email, recipient's email, and subject of each message, filtering by outgoing emails and outgoing domain. I've read about importing a CSV file into a spreadsheet program; the issue is that I have to add field separators to the log file, and I couldn't find how to customize that. How can I do that, the list and the separator? This is an example from the sending-mail log:

        Jun 11 15:24:58 host postfix/cleanup[19060]: F41C660D98A0: warning: header Subject: TESTING SUBJECT from unknown[XXX.XXX.XXX.XXX]; [email protected] [email protected] proto=ESMTP helo=<[192.168.1.91]
        Jun 11 15:25:01 host postfix/smtp[19062]: F41C660D98A0: to=, relay=mx-rl.com[xxx.xxx.xxx.xxx]:25, delay=3.4, delays=0.66/0.01/0.86/1.9, dsn=2.0.0, status=sent (250 <538E30D9000A1DD8 Mail accepted)

    The list would contain the three bold fields, filtered by to=[email protected]

    Read the article
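
    A sketch of the extraction in PHP: pull date, queue ID, subject, sender and recipient from the cleanup line, key the rows on the queue ID, and let fputcsv produce the field separators. The regex assumes the standard from=<...> to=<...> form that the quoted log shows in mangled shape:

        <?php
        $rows = array();
        foreach (file('/var/log/maillog') as $line) {
            if (preg_match('/^(\w+ +\d+ [\d:]+) \S+ postfix\/cleanup\[\d+\]: ' .
                           '(\w+): warning: header Subject: (.*?) ' .
                           'from [^;]+; from=<([^>]*)> to=<([^>]*)>/', $line, $m)) {
                // queue ID => date, sender, recipient, subject
                $rows[$m[2]] = array($m[1], $m[4], $m[5], $m[3]);
            }
        }
        $out = fopen('php://stdout', 'w');
        foreach ($rows as $r) {
            fputcsv($out, $r);  // CSV, ready for the spreadsheet program
        }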

  • Google web search shows dateCreated instead of dateModified metadata

    - by LonelyPixel
    So today I discovered that the pages from my website are listed with an unexpected date value. I specify the schema.org properties dateCreated and dateModified for most of my content pages. I'd expect search results to show when a page was last updated, to give a sense of the currency of the page. But Google is showing the date of first publication, which may be years ago. That's a bit unsatisfying, but I don't want to misuse the metadata just because Google reads it wrong. Some search terms for you to try it out: "gitrevisiontool"; "easyxml"; "multiselecttreeview" (look for the results on dev.unclassified.de; the human- and machine-readable dates come at the end of the page). Does anybody know more about what's wrong here? Or does it work as designed? (What a stupid design that would be.)

    Read the article
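
    For reference, the markup in question looks roughly like this microdata sketch (dates hypothetical); both properties are valid schema.org, and Google simply chooses which date, if any, to surface in the snippet:

        <article itemscope itemtype="http://schema.org/Article">
          <h1 itemprop="name">GitRevisionTool</h1>
          <meta itemprop="dateCreated"  content="2011-03-01">
          <meta itemprop="dateModified" content="2013-02-10">
          ...
        </article>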
