Search Results

Search found 9724 results on 389 pages for 'pro zeck'.

Page 102/389 | < Previous Page | 98 99 100 101 102 103 104 105 106 107 108 109  | Next Page >

  • Webpage loading with wrong content-type after setting up CloudFlare

    - by Daniel Little
    I recently migrated my blog to the Ghost service and also set up an alias DNS record with CloudFlare. While showing the blog to a colleague I discovered that one of the posts wasn't loading properly and would instead prompt to be downloaded with an application/octet-stream content type. I can view all the pages without any issues, and I believe we're both on the same network. Has anyone seen a wrong content type like application/octet-stream when using CloudFlare, or does anyone know what I can do to correct this?
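
    One way to narrow this down is to compare the Content-Type header the origin sends with the one delivered through CloudFlare. A minimal sketch in Node.js (assumes Node 18+ run as an ES module; both URLs are placeholders for the affected post, one through CloudFlare and one straight at the origin host):

        // Print the status and Content-Type returned for the same post, so a
        // mismatch points at the proxy/cache rather than at Ghost itself.
        const urls = [
          'http://blog.example.com/affected-post/',    // through CloudFlare
          'http://origin.example.com/affected-post/',  // direct to the origin (assumption)
        ];

        for (const url of urls) {
          const res = await fetch(url, { method: 'HEAD' });
          console.log(url, '->', res.status, res.headers.get('content-type'));
        }

    If the origin already reports application/octet-stream the fix belongs on the Ghost side; if only the proxied response is wrong, purging the CloudFlare cache for that URL is a reasonable first step.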

    Read the article

  • Assigning Static Public IP Address to Windows Server 2008

    - by Neeti
    Please help a newbie; I am new to Windows Server. I have an IBM server and I have installed Windows Server 2008 R2 on it. My ISP has provided me with a static IP address. How can I assign that to my server? I have a web application hosted on the server which I need to access from the outside world using an internet browser. How can this be achieved? Please let me know if there are any tutorials or step-by-step guides for what I am trying to do.
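
    For reference, the address can be set either in the adapter's TCP/IPv4 properties dialog or from an elevated command prompt. A sketch with netsh, assuming the adapter is named "Local Area Connection" and using placeholder values (substitute the IP, subnet mask, gateway, and DNS server your ISP gave you):

        rem Assign the static IPv4 address, subnet mask and default gateway (placeholders).
        netsh interface ipv4 set address name="Local Area Connection" static 203.0.113.10 255.255.255.248 203.0.113.9

        rem Point the adapter at a DNS server as well.
        netsh interface ipv4 set dnsservers name="Local Area Connection" static 8.8.8.8 primary

    The web application will also need an inbound Windows Firewall rule for TCP port 80 (and 443 if you use HTTPS) before it is reachable from outside.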

    Read the article

  • How to avoid Hotmail/Live rejections for (legit) large-volume emailing?

    - by vmarquez
    When qualifying email as spam, Hotmail/Live checks the sender's history of email volumes (FROM address, email server, IP, etc.). Sometimes perfectly valid bulk emails that are not spam (i.e. a double opt-in list, sent from a server with a proper SPF record, signed with DKIM, with unsubscribe links and contact info, etc.) are rejected and not delivered to recipients, not even to their Junk folder. I guess we can avoid this situation by progressively "training" Hotmail/Live about our sender's reputation, sending small quantities of email initially and increasing the quantity by some amount or percentage with each delivery. Are there guidelines, or do you have any experience with these quantities, strategies, or solutions? Thank you in advance. EDIT: This question with a bounty is still unanswered. 8 hours until it is automatically awarded! Do you have the answer?

    Read the article

  • What's the ideal setup for a news minisite for an app? [closed]

    - by Leonardo Amigoni
    I am a mobile app developer, and I would like my application to check for news about new updates of the app when the user opens it. I am unfamiliar with how I would check, against a server, whether the news items are actually new or have already been read. If they have not been read, I could of course display them in the app. Can anyone point me in the right direction on how to achieve this? Something similar to an RSS feed, but on mobile. Thanks
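
    A common pattern is to publish a small JSON (or RSS) feed of news items with incrementing ids and have the client remember the highest id it has already shown. A minimal sketch in JavaScript, with a feed URL and JSON shape of my own choosing (a native app would use its own HTTP client and local storage rather than fetch/localStorage, and showNewsDialog is a hypothetical UI hook):

        async function checkForNews() {
          // Feed shape assumed: [{ "id": 42, "title": "...", "body": "..." }, ...]
          const res = await fetch('https://example.com/app-news.json');
          const items = await res.json();
          const lastSeen = Number(localStorage.getItem('lastSeenNewsId') || 0);
          const unread = items.filter(item => item.id > lastSeen);
          if (unread.length > 0) {
            showNewsDialog(unread); // hypothetical: render the unread items in the app
            localStorage.setItem('lastSeenNewsId', String(Math.max(...unread.map(i => i.id))));
          }
        }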

    Read the article

  • In-Page Google Analytics showing no pageviews recorded

    - by Nicolo77
    I am trying to use Google In-Page Analytics. The rest of Google Analytics seems to work correctly on my site, but when I go to the new In-Page Analytics report, no clicks appear. I just get an error saying "There are no pageviews recorded for this page. Try adjusting the date range or select an alternate page." Yet the content details to the left show the number of pageviews. Do I need to set up something special for In-Page Analytics to work?

    Read the article

  • What a web developer can learn [closed]

    - by knoxxs
    There are many things to learn in web development. You can easily find out the most important things you need to learn if you want to be a webmaster; answers to questions about how to become a web developer or a webmaster contain only a limited list of items that someone needs to master (some examples: a, b). But the problem is that these resources are not complete. When I started learning web development I followed the same steps, but after learning the basics I realized I had barely scratched the surface; there are many more things to learn. I came to see this by following blogs and Q&A sites. When I first downloaded HTML5 Boilerplate, some of the issues it covers were ones I hadn't even heard of. I would like you to suggest what things and issues someone can learn, and why they are worth learning. I know the usual answer is "follow blogs and do your work, you will learn with time", but through these platforms I could benefit from other people's experience. This question is not about how to become a webmaster, but an answer to it may cover that too.

    Read the article

  • Seeking .htaccess help: Converting multiple subdomains (both HTTP and HTTPS) to www.domain.com using .htaccess

    - by Joshua Dorkin
    I've been trying to get an answer to this question on other forums (the folks at SuperUser thought this was the place I needed to post) and via my connections, but I haven't gotten very far. Hopefully you guys can help me find an answer. I've got a dozen old subdomains that have been indexed by Google, both as HTTP and as HTTPS. I've managed to redirect all the subdomains properly, provided they are not HTTPS, but I can't get any of the HTTPS subdomains to properly redirect. Here's the code I'm using:

        RewriteCond %{HTTP_HOST} ^subdomain1.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain2.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain3.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

    This works great until someone goes to https://subdomain2.mysite.com, which is not redirected back to http://www.mysite.com. How can I get this to work? Also, I'm guessing there is an easier way than setting up a dozen pairs of RewriteCond/RewriteRule; is there any way to do this in just a few lines, including one where I list all the subdomains? I'd also like to redirect everything on https://www.mysite.com to http://www.mysite.com except for three folders: mysite.com/secure, mysite.com/store, and mysite.com/user. Is there a good way to add this to the .htaccess file?
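
    A sketch of a consolidated rule set, assuming Apache mod_rewrite and that HTTPS requests for the subdomains actually reach this .htaccess (i.e. the SSL virtual hosts share the same document root and have valid certificates; otherwise the browser raises a certificate warning before any redirect can run):

        RewriteEngine On

        # Any of the old subdomains, over HTTP or HTTPS, in a single rule.
        RewriteCond %{HTTP_HOST} ^(subdomain1|subdomain2|subdomain3)\.mysite\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

        # Drop HTTPS on www.mysite.com except for the three secure folders.
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(secure|store|user)(/|$)
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]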

    Read the article

  • Blogger widgets speed problem

    - by Wladimir Ivanov
    Recently I installed Google Analytics on a Blogger account and was shocked to see load times for the landing pages of between 10 and 60 seconds. The blog uses a Facebook like-box, a Twitter recent-messages box, a live traffic feed widget, and Lockerz share buttons. Almost every post also contains YouTube iframes, which are nowhere near fast. Are there any well-known solutions for this type of problem? Should I use some jQuery plugin for speed optimization, and how can I make the Facebook/Twitter boxes load faster?
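
    One technique that helps regardless of the widget is to defer the heavy embeds until the page has rendered. A sketch for the YouTube iframes, assuming the template is edited so each embed carries a data-src attribute instead of src (the attribute name is my own choice):

        // In the post markup: <iframe data-src="http://www.youtube.com/embed/VIDEO_ID" ...></iframe>
        // After the page has loaded, swap the real src back in so the iframes
        // no longer block the initial render.
        window.addEventListener('load', function () {
          var frames = document.querySelectorAll('iframe[data-src]');
          for (var i = 0; i < frames.length; i++) {
            frames[i].src = frames[i].getAttribute('data-src');
          }
        });

    The Facebook and Twitter boxes can be treated the same way by loading their scripts asynchronously after the load event instead of inline in the template.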

    Read the article

  • Google doesn't always show rich snippets when the site uses structured data [duplicate]

    - by Sam Se
    This question is an exact duplicate of: Google Structured Data [on hold]. I'm so tired of the Google structured data recipe. After some days it loses the image and the extra information; then I test it again, and it shows up again. Some days later it might disappear even though it still shows in the testing tool. What can I do? I have tried both RDFa and schema.org microdata.
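
    For reference, a minimal sketch of a recipe marked up with schema.org microdata (the names, image URL, and values are placeholders); note that even when the markup validates in the testing tool, Google decides on its own whether and when to display the rich snippet:

        <div itemscope itemtype="http://schema.org/Recipe">
          <h1 itemprop="name">Chocolate Cake</h1>
          <img itemprop="image" src="http://example.com/cake.jpg" alt="Chocolate cake" />
          <span itemprop="author">Sam Se</span>
          <time itemprop="totalTime" datetime="PT1H">1 hour</time>
          <div itemprop="description">A simple chocolate cake.</div>
        </div>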

    Read the article

  • One domain and multiple websites in folders

    - by User1212
    I am going to create a network with one domain, e.g. example.com, and then manage my websites in folders, for example:

        www.example.com/market
        www.example.com/freebies
        www.example.com/personalblog
        www.example.com/shop

    Consider that all four websites have different designs and code bases. From an SEO perspective, is this recommended, or should I use subdomains or buy four separate domains, one for each website?

    Read the article

  • Advertising servers / advert delivery solutions for C#/ASP.NET

    - by Karl Cassar
    We have a website in which we want to show adverts; however, these are custom adverts uploaded by the webmaster, not Google adverts or whatever adverts the network chooses. Ideally there would be both options. We were considering developing our own advert-management system, but looking at the big picture it might be better to consider other alternatives. The website is currently developed in C#/ASP.NET (Web Forms). Are there any recommendations for open-source ad servers and/or externally hosted advert delivery networks? Personally I've used Google's DFP, but it is not always easy to get a Google AdSense account approved, especially while developing a new website that has not yet launched. Not sure if this is the best place to ask this kind of question!

    Read the article

  • Moodle Inconsistencies, Flowplayer, or Server?

    - by dglickler
    We are trying to decide whether an issue we're having with Moodle and our videos is a Flowplayer issue, a video issue, or a network issue; any input is welcome. Videos in our Moodle (version 1.9; we're working on an upgrade on a different server) have been up for weeks, or even months. After that time, some of them have suddenly just stopped working: they simply don't load, and Flowplayer gives no error, just a blank screen. The videos work when we first upload them, and there are no keyframe issues with them. We have re-uploaded the videos several times when this has happened, but we would really like to know what's causing it so that we can prevent it. We are currently trying to find answers through various searches, but have not had any luck. I will continue to post more information as I come across it, but if anyone knows or has ideas, they are welcome.

    Read the article

  • How can I hide some content from Chrome/Chromium browsers?

    - by MIH1406
    I need to put a "Bookmark us" in my website. But as I searched using Google all the results conclude that no way to do "Bookmark us" for Chrome/Chromium browsers. So I want to either: 1- Hide the content from chrome/chromium browsers. or at least, 2- Show a message if the user's browser is chrome/chromium after clicking that buttong. Here is my "Bookmark Us" script: /** Bookmark Us */ function bookmark_us(url, title){ if(window.sidebar) // firefox window.sidebar.addPanel(title, url, ""); else if(window.opera && window.print){ // opera var elem = document.createElement('a'); elem.setAttribute('href',url); elem.setAttribute('title',title); elem.setAttribute('rel','sidebar'); elem.click(); } else if(document.all) // ie window.external.AddFavorite(url, title); } else { } /** Bookmark Us */ <a href="javascript:bookmark_us('URL','TITLE')">Bookmark Us!</a>

    Read the article

  • Reset / Remove - Google Keywords

    - by Herr Kaleun
    Summary: my site is ranking for filthy keywords and I would like to remove them from Google's rankings/keywords. Background: my server was hacked using the TimThumb exploit/security vulnerability; apparently I was the last person on earth to read the news about the exploit, several months after it appeared. Anyway, the "hacker" was friendly enough to modify the index.php file in such a way that it generated random sexually oriented keywords whenever the page was fetched as Googlebot. So if the page was fetched and indexed as Googlebot, it contained randomly generated keywords like "sex videos teenager", "teen sex", "adult sex", "preteen", A LINK TO RANDOM CONTENT OF MY WEBPAGE, "anime sex videos", and so on (a rough list, about 180-200 per page). I discovered it far too late, so Google had me indexed for the word "sex" and certain adult-oriented keywords, roughly 2,000 in all. I removed all the content, took the site down, replaced index.php with a static HTML page, and added an "ERROR 410" title to the website to say that the content is no longer here and has been removed permanently. I also applied for a manual review of my website about 1.5 months ago, but the keywords are still there, and strangely some of the keyword rankings actually "improve" over time. I have screenshots from Webmaster Tools showing this. Question: how can I remove these filthy keywords and get my website re-ranked as a "normal" website as quickly as possible? I want to remove the keywords if possible. Please help me or point me in a direction. Thank you
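
    One thing worth checking: a page title of "ERROR 410" does not change the HTTP status code, and Google only treats URLs as permanently gone when the response itself is a 410 (or 404). A sketch of a placeholder index.php that sends a real 410, assuming the static page is currently served through PHP:

        <?php
        // Send a genuine 410 Gone status before any output.
        header($_SERVER['SERVER_PROTOCOL'] . ' 410 Gone');
        ?>
        <!DOCTYPE html>
        <html>
          <head><title>410 Gone</title></head>
          <body><p>This content has been removed permanently.</p></body>
        </html>

    You can confirm the status Googlebot sees with Fetch as Google in Webmaster Tools, and request removal of the indexed URLs there as well.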

    Read the article

  • WordPress feeds not indexing in Webmaster Tools

    - by jogesh_p
    I don't have much experience with Webmaster Tools; I only know the basics, and I am not from an SEO background. I just want to know: why are my blog's RSS feeds not being indexed in Webmaster Tools? I would also like to understand the Crawl Stats: are mine good or bad? And is submitting the RSS feed to Webmaster Tools good for getting pages indexed or not? I have also submitted the sitemap. The website is Webtech Eleven.

    Read the article

  • General questions about link spam

    - by hen3ry
    Hello, A CMS-based site I manage is suffering from a small but ominously growing number of almost certainly bot-emplaced, invisible spam links placed in registered-user-only shoutboxes and user forums. "Link Spam", yes? Until recently, I've kept my eyes on narrow tech issues, and I'm having trouble understanding what's going on. I understand that we need to tighten up our registration procedures, but more generally... Do I understand correctly that our primary interest in combatting link spam on our site is that major search engines reduce or zero the search visibility of sites that contain link spam? Although we're non-commercial, we don't want to be at the bottom of the rankings, or eliminated altogether. Are the linked-to sites the direct beneficiaries of the spam links, or is there some kind of indirection? What is the likely relationship between the link-spammers and the owners of the (directly or indirectly) linked-to sites? Are the owners of the linked-to sites paying the link-spammers for higher visibility? Are the owners aware that this method is being used? It is my impression that major search engines are capable these days of detecting that given sites are being promoted by link spam, and that these sites may consequently be reduced in search rank or dropped altogether. Do these sanctions occur frequently? Is there any potential value in sending notifications to the owners of the linked-to sites that their visibility is at risk? TIA, hen3ry

    Read the article

  • What is the proper way to setup Google Apps email accounts for a subdomain?

    - by binaryorganic
    Let's say I've registered domain.com at enom and set it up to use Google Apps for email by rerouting DNS to enom's servers and editing the MX records there. That works flawlessly. Now let's say I want to have email at a subdomain for that same site. I already have a working subdomain at the host, but I want to catch email traffic at enom before it gets that far. I've set up Google Apps as a new account for the subdomain, successfully verified domain ownership, and now they want me to update MX records. What's the right format? For domain.com, I just put @ for the hostname, and then provided the Address and Pref values that Google gave me. I tried putting subdomain.domain.com as new values under hostname for the subdomain, but that doesn't seem to work. What am I doing wrong?
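
    In most control panels, including enom's, the "Host Name" field takes only the label of the subdomain (e.g. subdomain), not the full subdomain.domain.com. A sketch of the resulting records in zone-file notation, using the standard Google Apps MX set (domain names are placeholders):

        ; Mail for *@subdomain.domain.com
        subdomain.domain.com.   IN  MX   1  ASPMX.L.GOOGLE.COM.
        subdomain.domain.com.   IN  MX   5  ALT1.ASPMX.L.GOOGLE.COM.
        subdomain.domain.com.   IN  MX   5  ALT2.ASPMX.L.GOOGLE.COM.
        subdomain.domain.com.   IN  MX  10  ASPMX2.GOOGLEMAIL.COM.
        subdomain.domain.com.   IN  MX  10  ASPMX3.GOOGLEMAIL.COM.

    After saving, the records can be checked with an MX lookup on subdomain.domain.com; Google Apps usually needs a few hours before it reports the records as found.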

    Read the article

  • Redirect from https://mydomain.com to http://mydomain.com

    - by Charlie
    Many of my visitors have already bookmarked my site with https://mydomain.com. On the bad advice of a programmer, I put my whole Joomla site under SSL. I do not sell anything or provide any member services. I asked him many times if it would slow my site down; he said it wouldn't. I knew it did, and after researching on this site I realized it does slow the site down because the pages cannot be cached. Understood. Please, someone tell me how to get away from it now. I'm not sure how to approach this: should I add something to my .htaccess or my main index.php file? I've looked all over the net; there is plenty of advice about redirecting from http to https, but very few answers about the opposite, going from https to http. Thank you very much for your time. I appreciate it.
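
    A sketch of an .htaccess approach, assuming Apache with mod_rewrite (visitors following old https bookmarks still negotiate SSL once before receiving the 301, so the certificate has to remain installed):

        RewriteEngine On
        # Send every HTTPS request to the same path over plain HTTP.
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]

    Placing this above Joomla's own rewrite rules lets the redirect run before the CMS routing does.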

    Read the article

  • Log oddities: 404s for client-garbled image URLs

    - by Chris Adams
    I've noticed some odd 404s which appear to come from broken URL-rewriting code. Our Deep Zoom view generates image URLs like this: /media/204/service/dzi/1/1_files/7/0_0.jpg. I see some requests (well under 1%) for slightly altered URLs such as /media/204/s/rvice/d/i/1/1_files/7/0_0.jpg. These requests come from IP addresses all over the world (US, Canada, China, Russia, India, Israel, etc.), from desktop and mobile users with multiple user agents (Chrome, IE, Firefox, Mobile Safari, etc.), and there is plenty of normal activity in the same sessions, so I'm assuming this is either widespread malware or some broken proxy service. I have not seen it for anything other than images, which suggests it may be some sort of content filter. Has anyone else seen this? My CDN logs show the first request on June 8th, ramping up from several dozen to several hundred per day.

    Read the article

  • IIS - HTTP Redirect all requests for one virtual directory to another

    - by nekno
    How do I set up an HTTP Redirect rule to redirect all requests for a virtual directory to another virtual directory, when I don't know the hostname or complete URL, and cannot use the URL Rewrite module? The following redirects should work:

        http://host1/app/oldvdir            -> http://host1/app/newvdir
        http://host1/app/oldvdir/           -> http://host1/app/newvdir/
        http://host1/app/oldvdir/login.aspx -> http://host1/app/newvdir/login.aspx
        http://host2/app/oldvdir/login.aspx -> http://host2/app/newvdir/login.aspx

    I would like to place the redirect rule in the app's root web.config. I have attempted the following rules, but the end result is simply that the redirected vdir gets duplicated on the end of the original vdir until reaching the max URL length, e.g. http://host/oldvdir/login.aspx -> http://host/oldvdir/newvdir/newvdir/newvdir/... Rules in the root web.config (I also have tried all sorts of combinations of settings with and without leading and trailing slashes, etc.):

        <location path="oldvdir">
          <system.webServer>
            <httpRedirect enabled="true" exactDestination="false" httpResponseStatus="Permanent">
              <add wildcard="*/oldvdir/*" destination="/newvdir/"/>
            </httpRedirect>
          </system.webServer>
        </location>
        <location path="oldvdir/">
          <system.webServer>
            <httpRedirect enabled="true" exactDestination="false" destination="/newvdir" httpResponseStatus="Permanent"/>
          </system.webServer>
        </location>

    Read the article

  • How important is the choice of domain registrars? [closed]

    - by Harry Muscle
    We're consolidating all of our hosting providers onto one provider, however, this provider is strictly a hosting provider, they are not a domain registrar. The question I have is how important is the choice of registrars? Or in other words, if I point the domains to the hosting companies name servers, can the reliability of my registrar affect my websites in any way? If the registrar were to go down for a few days would that impact the accessibility of the websites? What would it impact? Thanks, Harry

    Read the article

  • What to choose for a multilingual site with support for Markdown and commenting

    - by Kent
    I want to publish articles at a multilingual site. I want to be able to write an article in two languages and have them available on separate URLs:

        thesite.foo/english-breakfast
        thesite.com/engelsk-frukost

    If the user's web browser is set to English I'd like to show a small notice at the top of the Swedish version with a link to the English one. The link should have an appropriate rel attribute for a translation (search for hreflang at http://diveintohtml5.org/semantics.html). There should be a way to list all articles belonging to these sets: Swedish only, English only, Swedish versions + English only, English versions + Swedish only. I'd like to publish these as four RSS feeds. And I would like to have two versions of the main site, one in Swedish (showing Swedish versions + English only) and one in English (showing English versions). I should be able to write the articles using Markdown, as that is the formatting language I find most convenient. There should be a way for users to comment, and some way for me to protect myself against comment spam. I am leaning towards learning Drupal, and I suspect I'll have to code this behavior myself as a module. To be frank I'd rather work with Java. Is Drupal the way to go? Or is there something more suitable for this project?
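
    For the translation links themselves, a sketch in HTML (assuming both language versions live on the same domain, following the example URLs above):

        <!-- In the <head> of each version, point at both language variants. -->
        <link rel="alternate" hreflang="en" href="http://thesite.foo/english-breakfast" />
        <link rel="alternate" hreflang="sv" href="http://thesite.foo/engelsk-frukost" />

        <!-- The visible notice at the top of the Swedish version. -->
        <a href="/english-breakfast" rel="alternate" hreflang="en">Read this article in English</a>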

    Read the article

  • Ensure we're found in Facebook search for both full & abbreviated company names?

    - by hawbsl
    We have a client with a Facebook page; let's say his company is called Bob Roberts Super Widgets. If you search in Facebook for Bob Roberts Super Widgets, up he pops. But the shorthand he's commonly known by is BR Super Widgets, and indeed the website we've created for him is br-super-widgets.com. In Facebook, searching for BR Super Widgets doesn't turn up our Mr Bob. We don't have a lot of Facebook expertise, so we're asking for help here. Does anyone know how to ensure you're found in Facebook search for both the short and long company names? We found this similar question in the Facebook forum, but the poor old questioner never got a response.

    Read the article

  • Is there a way to disallow only crawling in https in robots.txt?

    - by David Wilkins
    I just realized that Bingbot is crawling my company's website's pages over https. Bing already crawls the site over http, so this seems frivolous. Is there a way to specify Disallow: / for https only? According to Wikipedia, each protocol has its own robots.txt, and according to Google's robots.txt specification, the robots.txt applies to both http and https. I don't want to Disallow: / for Bing entirely, just over https.
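
    Since each protocol can be served its own robots.txt, one sketch of a fix on Apache (the file name robots_https.txt and the use of mod_rewrite are assumptions about the setup) is to hand out a different file when the request arrives over HTTPS:

        # .htaccess: serve robots_https.txt as /robots.txt for HTTPS requests only.
        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteRule ^robots\.txt$ /robots_https.txt [L]

    where robots_https.txt simply contains:

        User-agent: *
        Disallow: /

    This keeps the existing robots.txt untouched for http while telling every crawler, Bingbot included, to stay out of the https version.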

    Read the article

  • Multiple 301 redirects, do search engines/viewers see them all?

    - by Karim
    I've put in place lots of different 301 rules to deal with numerous URL changes, and for certain URLs there are 3-4 different 301 redirects landing the visitor at the new URL. I've heard that a 301 loses some PageRank/link juice. All the 301s are on-site, for the same domain, with a mix of PHP 301s and .htaccess 301s. So, for instance:

        articles/news.php?id=2 -> articles/blog.php?id=2   [filename change]
        articles/*             -> /*                        [subdirectory to root]
        /blog.php?id=2         -> /title-of-post            [mod_rewrite URL change]

    So if you were to visit /articles/news.php?id=2 there would be two 301 redirects until you land on /yellow-wellington-boots/. My question is: does Google see the intermediate redirects, or just the final page the 301s redirect to?
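
    Whatever the crawler does with the intermediate hops, the chain itself can be avoided by adding a specific rule for the old URL above the generic ones so it answers with a single 301 straight to the final address. A sketch in mod_rewrite, using the example URLs above (the final path is taken from the question):

        # Map the oldest URL form directly to the final pretty URL in one hop.
        RewriteCond %{QUERY_STRING} ^id=2$
        RewriteRule ^articles/news\.php$ /yellow-wellington-boots/? [R=301,L]

    The trailing ? on the substitution drops the old query string so it is not appended to the new URL.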

    Read the article
