Search Results

Search found 17124 results on 685 pages for 'final cut pro'.


  • Google Analytics shows zero for the "Search Engine Optimization" graph

    - by Saeed Neamati
    In Google Analytics' new design, there is an area for the queries and impressions related to your site. You can get there via Traffic Sources → Search Engine Optimization → Queries. However, the "Site Usage" graph in the top section now shows zero, while other areas of Google Analytics clearly show that the site has visitors and has been used. No matter how much I search, I can't find the source of the problem. Does anyone know where the problem might be?

    Read the article

  • https:// search results appearing on Google for a purely http:// site

    - by hydrurga
    I started weeding through my site's search results from Google today, using a site: search, to determine if there are any links that cause 404s and thus need redirecting. To my amazement I noticed numerous https:// results for various pages. My site doesn't have an SSL certificate, doesn't serve such pages, doesn't internally link to https:// pages, doesn't include any such files in its sitemap.xml and, for all of these, never has. I did a Google search for https://<my site> and found one site that incorrectly refers to the root of my site with an https:// prefix - I will try to contact them to get them to correct this. I'm not sure, however, how Googlebot managed to index the non-root files as https://. I can't find any external links to them, and surely, without certification, Googlebot should have stalled at the first request? I've just added the following lines to the site's .htaccess (although the surfer still has to click through the browser's "This site is a security risk. Abandon hope all ye who enter here!" message(s) first to get there):

        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://www.<my site>.org/$1 [R=301,L]

    replacing <my site> with my domain name. My big question, though, is this: I would like to use the Google Webmaster Tools Remove URLs feature to remove the https:// pages from the index. Can I be guaranteed that this will only remove the https:// versions of each relevant page and not the valid http:// versions? My thanks to anyone who can help me out with this particular question and the issue in general.
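
    [Editor's note: not part of the original question, but a common belt-and-braces addition alongside the 301 is a canonical link in each page's head, so search engines fold any stray https:// URL back onto the http:// one. A minimal sketch; the URL is a placeholder, not the poster's real domain:]

        <!-- canonical hint for search engines; replace with the real page URL -->
        <link rel="canonical" href="http://www.example.org/some-page.html" />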

    Read the article

  • How to fix bad SEO after being hacked

    - by mkprogramming
    About a year ago my WordPress website was hacked, and some company decided to go nuts and actually do some "SEO" on the various links it created. Some of the pages would show up on Google as "payday cash advance" instead of "portfolio". The issue has been resolved, but now, as I've been doing good SEO, I've noticed when checking backlinks that there are TONS of links still on the internet (mostly on broken sites now) that point to my website with titles like "get a loan today" and so on. Is there a way to remove these links? Can I tell Google to ignore them? Help!
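
    [Editor's note: for backlinks that can't be removed at the source, Google's Disavow Links tool accepts a plain-text file listing URLs or whole domains to ignore. A minimal sketch of the file format; the domains below are made up for illustration:]

        # spammy links created while the site was hacked
        domain:payday-loans-example.com
        domain:cash-advance-example.net
        http://some-broken-site.example/old-hacked-link.html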

    Read the article

  • 302 Redirect causes garbage at end of Wordpress link in Facebook

    - by Joao
    When I try to link my WordPress blog on Facebook, the URL doesn't resolve properly. There's garbage appended at the end, and Facebook is not able to retrieve information from the site. It happens with every page, post or main entry. Here's what happens: http://clarissarezende.com.br/ shows up in Facebook as http://clarissarezende.com.br/UPLcS/ (when I copy/paste the link) and no information about the site shows up in FB. I'm using WordPress 3.3.1 with ProPhoto 4. Recently I moved the DNS entry on my ISP. The blog is hosted at clarissarezende.com.br/public_html/blog2, and before, the DNS would point to public_html; then I changed it to public_html/blog2. Note that I did not move any WordPress files. I made the (I think) necessary changes all over Facebook, but still no dice... Any ideas on what can be happening?
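
    [Editor's note: one way to tell Facebook's scraper which URL to use, regardless of what redirects do to the address bar, is an Open Graph URL tag in the page head. A sketch using the poster's domain:]

        <!-- tells Facebook's scraper the canonical URL to fetch metadata from -->
        <meta property="og:url" content="http://clarissarezende.com.br/" />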

    Read the article

  • Reverse proxying only a specific URL

    - by Bart Silverstrim
    I have a web server at www.ourcompany.com running Apache2. Using the proxy modules, I am able to (for example) make 172.16.0.5, an internal IP device, accessible at www.ourcompany.com/device. The trouble is that anyone can play with or explore the device using strings sent to www.ourcompany.com/device/change/settings/here.html. I'd like the reverse proxy to only work for a specific URL, www.ourcompany.com/device/you/must/use/this, while anything else is rejected if requested. Is there a setting that can be used to do this, or is it a simple rewrite condition placed in the virtualhost for the site under sites-enabled? What is the simplest, most maintainable way to sanitize requests to the internal device through the reverse proxy? Running Apache2 on Ubuntu.
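
    [Editor's note: a minimal sketch of one approach, using mod_proxy's exclusion syntax and the poster's example paths - proxy only the single allowed URL and exclude the rest of the prefix. ProxyPass directives match in configuration order, first match wins:]

        # inside the site's VirtualHost (requires mod_proxy and mod_proxy_http)
        ProxyRequests Off

        # proxy only the single allowed path
        ProxyPass        /device/you/must/use/this http://172.16.0.5/you/must/use/this
        ProxyPassReverse /device/you/must/use/this http://172.16.0.5/you/must/use/this

        # "!" excludes everything else under /device from proxying (served locally, i.e. 404)
        ProxyPass /device !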

    Read the article

  • Web hosting providers for businesses (with offsite backups, disaster recovery options, etc.) [closed]

    - by Harry Muscle
    Possible Duplicate: How to find web hosting that meets my requirements? I'm wondering if anyone can point me in the direction of a couple of web hosting providers that are geared towards businesses. By this I mean providers that make it easy to create daily offsite backups, are aware that websites require disaster recovery options, and have these in place or can assist with them. We currently have about a dozen sites with various providers; however, I've been asked to consolidate all of these onto one provider and create a full disaster recovery plan. Unfortunately it seems like most providers are geared towards average users that don't require all the extra bells and whistles that businesses need. For example HostGator, a very popular and well-reviewed provider, doesn't even allow you to schedule full backups; they have to be manually requested via cPanel and then downloaded once available. If anyone can point out a couple of companies that might be able to help with these sorts of things, that would be much appreciated. Thanks, Harry. P.S. I should also add that we are hoping to stay away from having to manage our own server; we're hoping for a fully managed solution like what HostGator would offer, for example.

    Read the article

  • How can I choose between Linux and Windows hosting?

    - by Mohamad
    I am a relative beginner when it comes to choosing web servers and hosting plans. I'm about to sign up for a hosting plan with GoDaddy. My main requirements are ColdFusion and MySQL. The plans on offer include Linux- and Windows-based plans. Which one should I choose, and why? I don't have many requirements other than what I mentioned above. I've never used Linux before, but I doubt I'll ever need to do anything beyond tinkering with my account. What are the main advantages of one over the other?

    Read the article

  • Sendmail encrypted

    - by user1948828
    I manage a website running on Apache. It has public and private areas. When people apply for an account to access the protected portions of the site, they do a TLS/SSL-protected POST containing their information, which is saved to a (hopefully) non-public directory on the server. I then have a Python script which takes URL-encoded POSTs with this user information, sends back a plaintext confirmation to the applicant, encrypts their information with a freeware Java command-line utility (specifically this one: http://spi.dod.mil/ewizard.htm), base64-encodes it, puts it in a file as a MIME attachment and uses sendmail to forward the information to my (and several coworkers', scattered around the country) email account(s) on an Exchange server with Outlook clients. This has worked well for years, but is awkward because it involves manually decrypting the information on a Windows box once it is received, using the above-mentioned encryption utility. This significantly limits how many applications can be processed. I would like to encrypt my information in a format that Outlook/Exchange can natively understand and display, so that these emails can be viewed simply by clicking on them. I do have company-provided PKI public certs for all the people I need to send to, and am able to send/receive encrypted emails in Outlook manually, but would like to know how I can send to Outlook from apache/linux/python on the command line using the same PKI certs. I don't need to receive them, just send. Is there a utility that can do this? I had thought PGP might, but I haven't been able to figure it out.
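
    [Editor's note: Outlook understands S/MIME natively, and OpenSSL can produce S/MIME messages from the recipients' existing public certs. A hedged command-line sketch - file names and addresses are placeholders, and it could be invoked from the Python script via subprocess:]

        # encrypt the application data as S/MIME for the recipient's public cert,
        # then hand the finished message to sendmail (-t reads recipients from headers)
        openssl smime -encrypt -aes256 \
            -in application.txt \
            -from webserver@example.org \
            -to coworker@example.org \
            -subject "New account application" \
            recipient-cert.pem | sendmail -t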

    Read the article

  • Unable to print login-required images in IE

    - by Tim Fountain
    I have some images in a section of a site that require the user to be logged in in order to view them. These images are served by a PHP script, which checks the user's login state and, if valid, serves the binary data with the appropriate headers. This all works fine. The issue comes when a user tries to print one of these images. In Internet Explorer, when they go to print preview they get the broken-image box with a red cross in the corner instead of the actual file, and this is what gets printed too. All other browsers print the images without issue. I have some images elsewhere on the site that are also served via PHP but don't require a login; these print fine. The PHP-powered HTML pages on the site that require a login also print fine in IE. It's just login-required images. The user hitting print preview does not seem to result in an additional HTTP request to the server for the file. However, I do see an additional HTTP request a few seconds later from the same IP (which may or may not be related); this request includes no host header, no REQUEST_URI and no user agent. The 'please login' page sends an appropriate 403 header. I've also added a far-in-future Expires header to the image response itself to ensure that browsers can serve/print the files from their own cache, but this hasn't made any difference. Why can't IE print the images, and what else can I do to investigate or fix the problem?
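
    [Editor's note: a frequently cited culprit is IE refusing to reuse content it was told not to cache when it re-renders a page for printing. A hedged PHP sketch of headers that usually keep IE's print engine happy - the session flag and file path are placeholders, not the poster's real code:]

        <?php
        // sketch: serve a protected image with cache headers IE's print engine tolerates
        session_start();
        if (empty($_SESSION['logged_in'])) {   // placeholder for the real login check
            header('HTTP/1.1 403 Forbidden');
            exit;
        }
        $file = '/path/outside/webroot/image.jpg';   // placeholder path
        header('Content-Type: image/jpeg');
        // 'private' lets the browser (but not shared caches) keep a copy,
        // which IE needs when it re-renders the page for print preview
        header('Cache-Control: private, max-age=3600');
        header('Content-Length: ' . filesize($file));
        readfile($file);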

    Read the article

  • SVG images grow and create scrollbars when on the server

    - by zuko
    Okay, so I embedded some SVG images into my page and opened it locally in Chrome, and it looked fine. I uploaded the same file to the server, looked at the page online, and the SVG images had grown by maybe 5-10% and were surrounded by scrollbars, like they were overflowing. I think it probably has to do with my lack of knowledge of how SVG and embed work. What's really puzzling me, though, is that it works fine locally. (I have cache disabled.) Help? Thanks. Edit: the HTML is:

        <embed type="image/svg+xml" src="content/web-logo.svg"/>

    There's no CSS on the image. I'm not sure if I was just wrong before or if I changed something I'm not aware of, but it doesn't appear to be actually changing size anymore; it just decides to stuff it into a scrollbox. Pic: https://www.dropbox.com/s/wt1aufi7nl1fpyi/svg-problem.png
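
    [Editor's note: a hedged guess at a fix, assuming the embed has no fixed size, so the plugin wrapper falls back to a default viewport and scrolls: give the embed explicit dimensions (the values below are made up) and make sure the SVG root element carries a viewBox so it scales to fit:]

        <!-- sketch: explicit dimensions stop the embed from spilling into scrollbars -->
        <embed type="image/svg+xml" src="content/web-logo.svg"
               width="300" height="120" />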

    Read the article

  • Do CDNs work with POST operations?

    - by iddqd
    I'm using a CDN (Level3) for the first time and I'm a bit confused. I'm accessing dynamic URLs such as http://cdn.mysite.com?getItem=1234 that return text data. Do CDNs work with HTTP POST operations? When I issue an HTTP POST, my "real" server receives the request every time, so I'm wondering if the CDN has a problem with POST operations. If I use HTTP GET it seems to work: I call the URL once (from my application) and can see my server receiving the request; if I call it a second time, the CDN delivers it directly and my server doesn't get anything. However, if I open the same link manually in a second browser tab, my server is asked to deliver it again - shouldn't it be cached by now? Many thanks.
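
    [Editor's note: CDN edges generally pass POSTs straight to the origin and cache only GETs, keyed on the full URL plus any Vary headers - which is one hedged explanation for the second-tab miss, since the browser sends different request headers than the application. A sketch of explicit cache hints on the origin response; the lookup function is a placeholder:]

        <?php
        // sketch: mark the GET response explicitly cacheable at the CDN edge
        header('Cache-Control: public, max-age=300');   // TTL is an example value
        header('Content-Type: text/plain');
        echo lookup_item((int)$_GET['getItem']);        // placeholder for the real lookup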

    Read the article

  • My parked domain was de-indexed by Google - what to do?

    - by Programmer Joe
    I have a question about how to handle my domain. In a nutshell, I bought a domain last year from Go Daddy. My intention was to launch a real site with this domain, and I have spent the last year working on it. For that year, I have been using the default Go Daddy parking page for an up-and-coming site. When I first bought the domain, it was indexed by Google: you could search for "alphabanter" and my site would show up on Google's results page. Several months ago, Google seems to have de-indexed my domain; if you type "alphabanter", my domain no longer shows up in the search results. Searching for "www.alphabanter.com" is now the only way it shows up. Anyway, I am about to launch my site for real, but I don't quite know if I can get it back into Google's index. I have a few questions:
    1) Was my domain permanently penalized by Google and removed from their index just because it was a parked domain? I don't believe I have done anything abusive other than using the Go Daddy default page for almost a year because my site was not ready.
    2) Should I just launch my site, put a few backlinks to it, and hope that Google indexes it again?
    3) Should I submit my site to Google via "Google submit your content"?
    I assume getting Google to reconsider my site is the last option if none of the above works.

    Read the article

  • How can I call a URL as a cron job in Webmin?

    - by EmmyS
    (Possibly this belongs on Stack Overflow, although it's not really a programming issue since the code works when run directly. If it needs to be moved, though, no problem.) I have a PHP file (which consumes a National Weather Service web service via SOAP, if it matters) that I need to run on a scheduled basis, so I'm trying to set up a cron job in Webmin. If I use an absolute path to the file in the Command field, when I run it I get some strange errors:

        /var/www/html/mysite.com/test/ndfdXMLclient.php: line 1: ?php: No such file or directory
        /var/www/html/mysite.com/test/ndfdXMLclient.php: line 2: //: is a directory
        /var/www/html/mysite.com/test/ndfdXMLclient.php: line 3: //DOCUMENTATION: No such file or directory
        /var/www/html/mysite.com/test/ndfdXMLclient.php: line 4: //: is a directory
        /var/www/html/mysite.com/test/ndfdXMLclient.php: line 5: syntax error near unexpected token `"running client code",'
        /var/www/html/mysite.com/test/ndfdXMLclient.php: line 5: `error_log("running client code", 1, "[email protected]");'

    The actual first five lines of my file look like this:

        <?php
        // ***************************************************************************
        //DOCUMENTATION FROM WEATHER.GOV ALL STORED IN xmlClientComments.txt
        // ***************************************************************************
        error_log("running client code", 1, "[email protected]");

    The code runs perfectly fine when I run it directly in my browser, so why doesn't Webmin recognize it as code? (The same thing happens if I enter the actual URL in the command field - http://mysite.com/test/ndfdXMLclient.php.) I've never worked with Webmin before; most of our hosts' cron control panels allow cron jobs to run PHP files like this with no issue. Is there some trick to getting Webmin to read PHP as actual PHP?
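
    [Editor's note: the errors show the shell, not PHP, executing the file - cron runs commands, not scripts-by-extension. A minimal sketch of two commands that would work in Webmin's Command field; the interpreter paths are guesses, check them with `which php` and `which curl`:]

        # run the script with the PHP CLI interpreter
        /usr/bin/php /var/www/html/mysite.com/test/ndfdXMLclient.php

        # or request it over HTTP so it runs exactly as it does in the browser
        /usr/bin/curl -s http://mysite.com/test/ndfdXMLclient.php > /dev/null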

    Read the article

  • Correctly indexing multiple domains with same content in Google and others

    - by AJweb
    I have a client with a dozen territorial domains, like mydomain.co.uk, mydomain.fr, mydomain.de, etc. Most of these domains hold a different-language version of the same dynamic content (a shop), but some, like .co.uk and .com, have the same language and content, except for some content customized per country/domain on the front, contact and other pages. I am aware that we should use the canonical link tag to mark that duplicated content, but we want the .co.uk site to be present in the UK (indexed in google.co.uk) and the .com to be present in the US and other countries, for example - or at least that is the goal. Is there anything we can do to "help" Google determine the geographical meaning of each domain? If we mark the .com and .co.uk sites with the canonical tag, do you know how Google will decide which one to show in a given search?
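
    [Editor's note: besides (or instead of) canonical tags, Google supports rel-alternate-hreflang annotations precisely for same-language regional variants, so each country sees its own domain. A sketch using the poster's placeholder domains - every variant lists all alternates, including itself:]

        <link rel="alternate" hreflang="en-gb" href="http://mydomain.co.uk/" />
        <link rel="alternate" hreflang="en-us" href="http://mydomain.com/" />
        <link rel="alternate" hreflang="fr-fr" href="http://mydomain.fr/" />
        <link rel="alternate" hreflang="de-de" href="http://mydomain.de/" />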

    Read the article

  • Setting Gmail as mail server

    - by Tim S.
    I'm in a slightly weird situation right now, and I don't have sufficient knowledge to sort this out myself without truly understanding what I'm doing. Yesterday I registered a domain (.com) and ordered a VPS attached to that domain. Chances are I may receive mail at my .com address to confirm the domain. Unfortunately, that domain is nothing but an empty domain; currently there's no mail server that fetches my mail. Because I don't have a mail server available, I (temporarily) want to use Gmail. I'd prefer to add it to my existing personal address, but I'm okay with creating a new account as well. I just want to read possible incoming mails. I've tried to set MX records, but I'm not sure what to point them at. What do I need to do to get mail to a Gmail address? PS. I'm aware of Google, NSA, etc. PPS. I just want to receive mail; I don't care if I can't send via my domain. PPPS. Detailed steps would be greatly appreciated, I'm a noob.
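
    [Editor's note: MX records alone can't deliver a domain's mail into a plain personal Gmail inbox; they have to point at a mail service that accepts mail for the domain. With a Google Apps account for the domain, the published Google MX set looks like this - a zone-file sketch with a placeholder domain, priorities as per Google's documentation of the time:]

        ; sketch: Google Apps MX records for example.com
        example.com.  IN  MX  1   ASPMX.L.GOOGLE.COM.
        example.com.  IN  MX  5   ALT1.ASPMX.L.GOOGLE.COM.
        example.com.  IN  MX  5   ALT2.ASPMX.L.GOOGLE.COM.
        example.com.  IN  MX  10  ASPMX2.GOOGLEMAIL.COM.
        example.com.  IN  MX  10  ASPMX3.GOOGLEMAIL.COM.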

    Read the article

  • Using Moniker.com's nameservers

    - by user7519
    I have a VPS with A2Hosting for which I need to upgrade the OS. However, they've changed their VPS packages and forced me to order a new one. I went with an "unmanaged" package and have only just realised that they do not provide any DNS service at all, not even nameservers. Support tells me that "since your domain is not hosted with us, but with Moniker, you would not be able to use these nameservers. Your domain registrar should have a set of default nameservers that you can use, then create an A record to point to" my IP address. Moniker does provide for using their nameservers, but I'm confused about which "pre-defined zone configuration" to use. The options are: Domain Parking; Domain Parking with Email Forwarding; URL and Email Forwarding; URL Forwarding; URL Forwarding & CoolHandle Email. I just want to use their nameservers and then create A & MX records pointing to the VPS. What do they mean by forwarding? I get the feeling it's a service that I don't want. Or is it that I need a pre-defined zone only temporarily, and THEN set the A & MX records? Which of these should I choose?
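
    [Editor's note: "URL forwarding" is an HTTP redirect service hosted by the registrar, which isn't needed when the VPS serves the site itself. Assuming the nameservers are used as a plain DNS host, the records to add are just A records for the VPS and an MX pointing back at it - a sketch with placeholder names and a documentation-range IP:]

        ; sketch: minimal zone entries pointing the domain at the VPS
        example.com.       IN  A   203.0.113.10
        www.example.com.   IN  A   203.0.113.10
        example.com.       IN  MX  10  mail.example.com.
        mail.example.com.  IN  A   203.0.113.10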

    Read the article

  • How can I get more information about worldwide nodes?

    - by Aubergine
    The context: six hosts across the world were traced over a week from the UK - tens of thousands of lines to be parsed and analysed. I then try to find any clue of geographical information and the path, i.e. from where it jumps where. After Austria or Germany (each time different) I hit the mysterious 62.208.72.6, which a geo-IP lookup places in the Falkland Islands (which is where my target host is, by the way - but before the target host I still have 5 other nodes). Then I do a whois for 62.208.72.6:

        route:   62.208.0.0/16
        descr:   DE-ECRC-62-208-0-0
        origin:  AS1273
        mnt-by:  CW-EUROPE-GSOC
        source:  RIPE # Filtered

    Why does it say Europe now? How do I make sense of this? I want to confirm more or less whether this hop is in Europe or in the Falkland Islands - but it can't be in FK yet, as two hosts later I get New York? Could you also tell me what the abbreviation CW-EUROPE-GSOC means? (To preserve your sanity, better not to google it unless you already know. :-D) And here is the actual whois for the destination/target host, which completely destroys my head:

        route:   195.248.193.0/24
        descr:   HORIZON
        descr:   Cable and Wireless Falkland Islands
        descr:   Via Cable and Wireless Communications UK
        origin:  AS5551
        mnt-by:  AS5551-MNT
        source:  RIPE # Filtered

    How is it "Via Cable and Wireless Communications UK" if two nodes before I was in New York? Thank you, guys.

    Read the article

  • Dealing with blackhat SEO companies and low quality link building competitors [closed]

    - by Mikko Ohtamaa
    I have often faced a case where the competitors of my clients use blackhat SEO tactics: they contract an SEO company to do link building for their websites and products. Here is a typical case of a fake blog created only for link-building purposes: a very low-content article, http://marshallfab.com/fundus-camera-explained.html, in an obvious fake blog - no author information, partially machine-generated text, and all blog posts solely about link building. Following the link you get to the promoted company page, http://www.patternless.com/ ... which, unsurprisingly, links to the SEO company's homepage in the footer text, http://www.affordableseofl.com/ ... who are not shy about advertising their "extremely aggressive SEO plan". Does Google have any feedback channel where one could submit cases like this, so that Google would punish the link builders? Are there any means to bring these blackhat companies to public shame, to damage their reputation?

    Read the article

  • Why is <my site url> not indexed by search engines? [closed]

    - by Henrik Erlandsson
    The site was indexed fine until about a year ago. The only thing I can think of is that search engines throw up at using h5 before h4, or that some person (fantasizing now) has reported my site as unsafe to every search engine. However, I'm not here to speculate. The site validates, and has an RSS feed on the front page, for Pat Morita's sake! To me, it looks like the kind of site search engines would feast on; it's got more than a dozen blogs on it, if nothing else. Hah. :) I was hoping you could identify basically what has changed in search engines (currently Google, Yahoo and Bing, which all used to work fine) in the last year to make them not find news and blog articles on this site. The site was submitted to Google, oh, way back in 2006. With online crawler tests I get mixed results: some crawlers index fine, some come up blank. I don't really know which ones are reliable and am looking to you guys for advice on that. Yes, I am prepared to again verify my site with Google and upload a sitemap, but that's not the topic here. I really would first like to know what change on the site in the last year could make search engines not index it. (Yes, the robots.txt is fine; there should be nothing to discourage bots there.) It's a very intriguing problem, one which I have yet to find the reason for but would like to. Any and all input appreciated, but I would most enjoy pertinent advice. ;) Edit: some Google searches that don't show up include "aca630", which is posted in the news and blogs on the front page there. These search terms are extremely specific, as they are almost unique on the web; ACA630 is a very qualified search term that can't be confused with mainstream search terms.

    Read the article

  • Tag link suggestion plugin for WordPress?

    - by Emerson
    Hi, every time I write a post I make sure I add links to words that I have tags for. For example: "The economy of Brazil has improved in the last few years." This ensures that when people re-post my content, a lot of backlinks will be created to my tags. This is quite a lot of work to do manually for every post. It would be cool if there was a plugin that would suggest tags to apply when they match existing words in the text of the post. Is there such a thing?

    Read the article

  • Dotted subdomain name or new domain?

    - by Catalin Ilinca
    I have a company website hosted at www.BRAND.com (where BRAND is a generic name). The company wants to develop a "micro website" for one of their campaigns, named "Inspired By BRAND". I have two directions: inspired.by.BRAND.com - which I personally don't like too much; I don't know why, but I don't recall any web address with a subdomain.subdomain.domain.com pattern like this one. inspired.BRAND.com - which I think is best suited for it: fewer dots, and similar to "more friendly" subdomain.domain.com addresses. Any hints, guidelines or thoughts are well appreciated. Thanks in advance.

    Read the article

  • How do I get Paypal or a merchant account for a marketplace style web site?

    - by Brett G
    I'm having trouble getting approved for a merchant account for my website. Basically I have expert users and regular users. Expert users provide a service through my website, for which they set their own rates; users purchase the services and pay me, and I give 90% to the expert users. I have been told this is factoring. Is the way around this a system like freelancer.com uses, where users deposit money into their freelancer account and then pay for the services they won? What are the negatives of that system? What about sites like 99designs? They accept CC payments and then pay the winning designer. How are some sites doing this while I'm having so much trouble getting approved?

    Read the article

  • Looking for someone to point me in the right direction. I want to learn how to use hosted servers

    - by Leisure
    TL;DR: I want a Java program to run on a server, I want the server to forward a particular port from an external to an internal IP, and I want to store a few files on the server. Guides, please. So I made a hack-job Java program that acts as a server for my Android application. It stores data in text files and HTML files, uploads them via FTP to my web host, and manages socket connections (using port forwarding) with any phones connected. Right now I'm running it in NetBeans on my home computer. I know that it will probably slow down or crash once about 50 phones are connected at once. Is there any way I can run this program on a server with high bandwidth? Can someone please find me a guide for that? I'm a noob and don't know where to start looking. I seriously don't know anything about renting or using servers - I need a nice guide, and recommendations. My requirements for the server:
    - Can handle about 2k socket connections at once
    - Can run my Java code and store my txt files
    - Can give me a port and an IP address so TCP/IP clients can connect
    My budget: $50 CAD per month. Please someone set my ship sailing in the right direction, I really don't know where to look for resources.

    Read the article

  • Which hosted chat solutions offer the following?

    - by David
    I am looking for a chat room solution similar to the one on Stack Exchange to facilitate more responsive communication between the contributors on Open-Org.com. My criteria are the following:
    - No Flash (this rules out more than half)
    - Full history (meaning that it is possible to access all previous conversations for future reference)
    - Very customizable
    - No ugly IRC stuff filling up the chat view (I do not want to see who joined and who left, etc.)
    - No private conversations possible (this is just not in the spirit of Open-org.com)
    - A hosted solution with a reasonable price
    These criteria are so different from the linked question that this is not a duplicate. The service which matches them most closely is Chatroll.com; however, at $199 per month their prices are outrageous.

    Read the article

  • SilverStripe: How can I disable comments?

    - by SamIAm
    My client's site is built in SilverStripe. There is a news page, and it allows people to leave comments; unfortunately we've got loads of spam. I'm new to this - is there any way we can disable the comment field by default? How do I do it? Alternatively, is there an easy way for me to install spam protection? Update: because this is someone else's code, I just realised that they have some sort of spam protection already, so we are now trying to disable comments. I have managed to make "no comments" the default by changing BlogEntry.php from

        static $defaults = array(
            "ProvideComments" => true,
            'ShowInMenus' => false
        );

    to

        static $defaults = array(
            "ProvideComments" => false, //changed
            'ShowInMenus' => false
        );

    Am I on the right track to disable comments by default? Also, how can I stop the news page from showing the "xxx comments" link? E.g.:

        Test
        Posted by Admin on 21 June 2011 | 3 Comments
        Tags: P
        This is a test....
        3 comments | Read the full post
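
    [Editor's note: a hedged sketch for hiding the listing-page link, assuming a SilverStripe 2.x blog module whose summary markup lives in a theme template such as BlogSummary.ss (the file name may differ per theme): wrap the existing comments line in the same ProvideComments flag the poster already toggled, so it disappears wherever comments are off:]

        <% if ProvideComments %>
            <!-- the theme's existing "3 comments | Read the full post" markup goes here -->
        <% end_if %>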

    Read the article
