Search Results

Search found 14789 results on 592 pages for 'pro backup'.

Page 236 of 592

  • Creating an encrypted, web-based proxy

    - by Jason
    I have moved to Asia where my internet connection is censored and I'd like to check my messages from social sites which happen to be blocked. As virtually all proxy servers are blocked in this country, I've decided to attempt to roll my own encrypted proxy server. Please note, the key word here is encrypted—if the sniffer sees anything like f@c3b00k or w:k:p3d:ia travelling down the wire I'm had. I have a website hosted with GoDaddy (Windows with PHP 5.2 & IIS 7). Is there any way I can set up an encrypted proxy through this service? If so, how, and what open source tools are available to use?
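
    A minimal sketch of the idea, assuming the GoDaddy plan can serve the script over HTTPS and has the cURL extension enabled (both are assumptions): the page below fetches a target URL server-side and returns the body, so the wire only ever carries encrypted traffic to your own domain. A real deployment would also need to rewrite links, handle cookies, and restrict access; ready-made open-source PHP proxy scripts such as Glype build on the same idea.

        <?php
        // proxy.php -- a very rough sketch, not production-ready.
        // Usage (over HTTPS only): https://yourdomain.com/proxy.php?u=http%3A%2F%2Fexample.org%2F
        if (!isset($_GET['u'])) {
            die('No URL given');
        }
        $target = $_GET['u'];

        $ch = curl_init($target);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects
        curl_setopt($ch, CURLOPT_TIMEOUT, 20);
        $body = curl_exec($ch);
        $type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
        curl_close($ch);

        if ($body === false) {
            die('Fetch failed');
        }
        header('Content-Type: ' . ($type ? $type : 'text/html'));
        echo $body;   // note: links inside the fetched page are NOT rewritten in this sketch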

    Read the article

  • Page displaying sections using opacity in CSS3 but without navigating or scrolling down [closed]

    - by Senthil Kumaran
    Here is my app - http://www.shalgreetings.com/ I am trying to stop the page from scrolling down to an image section in the CSS, so that the whole app, with the logo, header, and other controls, stays visible at all times as people navigate through the different #sections. I am not sure where in the CSS I am making the mistake, because clicking a #section still scrolls the page. Here is this app's original inspiration code, which gets this right. Can anyone point out where the problem is in my app?

    Read the article

  • I want to host clients' websites, but not their email. What's the easiest way to handle this?

    - by Phil
    My company lets non-technical users build their own niche industry websites on our server, which we host. They can currently point their nameservers at their registrar to us, which leaves them with no access to their email if they've already set it up through that registrar. We don't want to interfere with their existing email, nor do we want to get into the business of setting up email for them through our service. Having them point A records/CNAMEs to us would work, but is this too complex for a non-techie user? We thought of having them point their nameservers to us while pointing the MX records back to their registrar, but this is also beyond their abilities. Is there an easy way to point the right records at us while leaving their email in its initial state? Any other ideas/feedback?

    Read the article

  • De-index URL parameters

    - by Doug Firr
    This question is lengthy, so here is a one-sentence summary: I need Google to de-index URLs that have certain parameters appended.

    I have a website, example.com, with language translations. There used to be many translations, but I deleted them all so that only English (the default) and French remain. When a visitor selects a language option, a parameter is added to the URL. For example, the home page:

        https://example.com (default)
        https://example.com/main?l=fr_FR (French)

    I added a robots.txt to stop Google from crawling any of the language translations:

        # robots.txt generated at http://www.mcanerin.com
        User-agent: *
        Disallow:
        Disallow: /cgi-bin/
        Disallow: /*?l=

    So no page containing "?l=" should be crawled. I checked this in GWT with the robots testing tool, and it works. But under HTML Improvements, the previously crawled language-translation URLs remain indexed. The usual advice is to return a 404 for the removed URLs so Google knows to de-index them, so I checked what my CMS serves when I visit one of the URLs that should no longer exist. This one was listed in GWT under duplicate title tags (one of the reasons I want to clean up my URLs):

        https://example.com/reports/view/884?l=vi_VN&l=hy_AM

    That URL should not exist, since I removed the language translations, yet the page loads anyway. I experimented and typed example.com?whatever123: it seems a page always loads as long as everything before the question mark is a real URL. So if Google has indexed all these URLs with parameters, how do I remove them? I can't even check whether a 404 is being generated, because the page always loads; it's the parameterised URL that needs to be de-indexed.
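
    One possible way to make the removed translations really disappear, sketched below as a hypothetical front-controller snippet (the parameter name l and the allowed value are taken from the question; everything else is an assumption about the CMS): return a real 404/410 whenever a request carries a language code that no longer exists, so Google can drop the URL. Bear in mind that URLs blocked in robots.txt cannot be recrawled, so the block may need to be lifted temporarily before Google can see the error response.

        <?php
        // Sketch: place early in the CMS front controller (index.php).
        // Anything other than the default (no parameter) or French is gone for good.
        $allowed = array('fr_FR');

        if (isset($_GET['l']) && !in_array($_GET['l'], $allowed, true)) {
            header('HTTP/1.1 410 Gone');        // or 404 Not Found
            echo 'This translation has been removed.';
            exit;
        }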

    Read the article

  • Garbled text in server logs

    - by Glenn Dayton
    I recently looked over my server's logs and I found a bunch of garbled text. Here is a link to the full log, and here is a snapshot of what it looks like: ¹^œÌÓûFF™ÃŒ-ôÚÏàÃÒNRs§cÝi ~F#J"|³Ôq0ã~QQbA ¼¹¦’š¶É3œßå<ú€Ç©XAwdL?R°ÝbÒt©ôÇ·Æ…÷q˜ÇѺ| Þ,߯¡Êr yR¤Q¹Jêlš‘AzP\ ¦ÂY„ÉÉ,æ™ U™»ì³ÔÝáCÿ42‹Ö.nŽÉ2%ÓN8i4Œ®¿‘•"-se•䎿ÊÁ§€þ 8åv%'#Äpžs/ÙÍ:¡1ÑÖÃå ºu|Q®!ÏyÆ,­NR@¶ËȯRDkã=ÿÀܸ ›¼Ô ’ð>ÓÌBftdÃ8–é}‰[øbãÝÁ嘲b¾W n´tT­œpäNëëÔ ·RUÓP+ÅuKÁ£¬\âÌ®:J<ÍÁ0:Q%ª(Œ˜E-ÁI:ï™4®hæœT†«);°Çda@´#èì}‡£ü•{57ý]¼|øÓñð÷ÈÌð‡MkŠâ•C~$Óô#ÙV¾Núå.#Á]vôžóæ» V&8)%øVSž“±ÔQLåÓý1–ŽÃßQ$¹ýž")ÈûQcÄý_ÔüGP=s‹vq#Pmoo.tigertutorialscomµÐOKÃ0ð»Ÿâ‘ØH“ What is this? and is someone trying to do something to my website?

    Read the article

  • SEO for replacing blog content, but keeping the same page URL

    - by cphill
    This might not have any major impact on SEO, but basically I have a random blog at the URL http://example.com/blog (not a real URL) that I am removing and replacing with a company blog. I want to keep using the http://example.com/blog address, but I'm not sure how this would affect my SEO, since the random blog content I am removing lives under the same example.com/blog prefix. Would I just add a 301 redirect for those old blog articles and leave the base /blog URL without any redirects?
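
    For reference, a 301 is just a permanent-redirect response; a minimal PHP sketch (the target path is a placeholder) that an old article's handler could emit once the new blog takes over the /blog path:

        <?php
        // Sketch: permanently redirect a removed article to the new blog index.
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: http://example.com/blog/');
        exit;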

    Read the article

  • .htaccess language redirects with seo-friendly urls

    - by jlmmns
    How do I set up my .htaccess file to detect several languages and redirect them to specific SEO-friendly URLs? Basically every URL needs to go to index.php?lang=(...). So, for English language detection, http://mysite.com has to go to http://mysite.com/en/ (index.php?lang=en). My .htaccess as of now (not working):

        RewriteEngine On
        RewriteCond %{HTTP:HOST} http://mysite.com/
        RewriteCond %{HTTP:Accept-Language} ^en [NC]
        RewriteRule ^$ http://mysite.com/en/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^de [NC]
        RewriteRule ^$ http://mysite.com/de/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^nl [NC]
        RewriteRule ^$ http://mysite.com/nl/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^fr [NC]
        RewriteRule ^$ http://mysite.com/fr/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^es [NC]
        RewriteRule ^$ http://mysite.com/es/ [L,R=301]
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-l
        RewriteRule ^(en|de|nl|fr|es)$ index.php?lang=$1 [L,QSA]
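
    For comparison, an untested sketch of how the same rules might look with two likely problems fixed: the Host header contains only the hostname (e.g. mysite.com), never http://mysite.com/, so the first condition can never match; and the final rule ^(en|de|nl|fr|es)$ never matches the redirected paths because they end in a trailing slash.

        RewriteEngine On

        # Redirect only the bare root, picking a folder from Accept-Language;
        # English is the fallback.
        RewriteCond %{HTTP:Accept-Language} ^de [NC]
        RewriteRule ^$ /de/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^nl [NC]
        RewriteRule ^$ /nl/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^fr [NC]
        RewriteRule ^$ /fr/ [L,R=301]
        RewriteCond %{HTTP:Accept-Language} ^es [NC]
        RewriteRule ^$ /es/ [L,R=301]
        RewriteRule ^$ /en/ [L,R=301]

        # Internally map /en/, /de/, ... to index.php?lang=xx
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(en|de|nl|fr|es)/?$ index.php?lang=$1 [L,QSA]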

    Read the article

  • What is the name for landing pages that are one long page?

    - by blunders
    Really don't see them much anymore, but here's an example of what I mean. From comments: these are "high pressure" sales pages, designed to overload the user with information and sell them on the belief that what they're buying is what they need; they normally have a lot of testimonials, highlighted text, etc. The pages I'm talking about are not user friendly; they're aggressive sales pitches designed to target users who want to believe the page they just landed on will solve their problems for an "affordable" price. Here's an example: www_landingpagecashmachine_com (remove the underscores, since I'm trying to avoid linking to a site like that). Bonus points if you can tell me the name of the guy/company that popularized these types of pages; I recall hearing about his company years ago, after he died in a crash while racing on a track with his Ferrari club on the west coast of the US. (Update: it appears Corey Rudl was the guy's name, and his company was called "The Internet Marketing Center." Even with that info, I've still been unable to find the name for these types of pages.)

    Read the article

  • Money in from website

    - by oshirowanen
    EDIT 1: It seems that PayPal's micropayments system is currently my best option for retaining as much of the $1 as possible. Does anyone know of a way to retain even more of the $1?

    ORIGINAL QUESTION: I need to receive money from users on my webpage. They will only pay very small amounts, i.e. $1 max, but the total will probably go up to $10,000.00. What is the best way to receive this money from a webpage? By "the best way", I mean a method where I lose as little of the money as possible in fees for receiving it.
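
    For a rough feel of why micropayments pricing matters on $1 payments, here is an illustrative calculation; the rates below are assumptions based on PayPal's published US pricing at the time, and they vary by region and change over time:

        <?php
        // Illustration only -- the fee rates are assumptions, check current pricing.
        $amount = 1.00;

        $standard_fee = 0.029 * $amount + 0.30;   // ~2.9% + $0.30 per transaction
        $micro_fee    = 0.05  * $amount + 0.05;   // ~5%   + $0.05 per transaction

        printf("Standard: keep $%.2f of $%.2f\n", $amount - $standard_fee, $amount);  // ~$0.67
        printf("Micro:    keep $%.2f of $%.2f\n", $amount - $micro_fee, $amount);     // $0.90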

    Read the article

  • Canonical links for huge websites

    - by Florin
    Let's say I have 5 products that are identical except for the product code, the color specification, and the product image. The title, meta, and description are identical (the color is chosen in a select form). I made 4 of the products canonical to the one that is the master, based on many factors. If the master becomes inactive or goes out of stock, one of the other 4 becomes the new master and the rest become canonical to it. The question: when a product switches from being canonicalized to being the master, will the site suffer a penalty from Google, or will it work just fine? What will Google think of this strategy?

    Read the article

  • Receive the Broadcast program with XML [on hold]

    - by bitmez4
    I run a channel listings site and I want to pull in CNN's broadcast schedule. CNN publishes the schedule here (you can see the XML file in the page source): http://tvprofil.net/xmltv/data/cnn.info/weekly_cnn.info_tvprofil.net.xml How can I pull the entries that match the current time? For example: now playing: "bitmez's table"; next: "stack's table" in 30 minutes. Is this possible?

    UPDATE 1: I can read the XML data, but only the whole file at once:

        <?php
        if (!$xml = simplexml_load_file('http://tvprofil.net/xmltv/data/cnn.info/weekly_cnn.info_tvprofil.net.xml')) {
            trigger_error('XML file -- read error', E_USER_ERROR);
        }
        echo 'X-';
        foreach ($xml as $programme) {
            echo 'Now: ' . $programme->title . ' <br/>';
        }
        ?>
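
    A sketch (untested) of how the now/next lookup might work, assuming the feed follows the usual XMLTV convention where each <programme> element carries start and stop attributes in the form 20140101203000 +0100; that format and the chronological ordering of the feed are assumptions, so check the actual file.

        <?php
        $xml = simplexml_load_file('http://tvprofil.net/xmltv/data/cnn.info/weekly_cnn.info_tvprofil.net.xml');
        if ($xml === false) {
            trigger_error('XML file -- read error', E_USER_ERROR);
        }

        $now     = new DateTime();
        $current = null;
        $next    = null;

        foreach ($xml->programme as $programme) {
            $start = DateTime::createFromFormat('YmdHis O', (string) $programme['start']);
            $stop  = DateTime::createFromFormat('YmdHis O', (string) $programme['stop']);

            if ($start <= $now && $now < $stop) {
                $current = $programme;                 // airing right now
            } elseif ($start > $now && $next === null) {
                $next = $programme;                    // first one after now (feed assumed chronological)
            }
        }

        if ($current !== null) {
            echo 'Now: ', $current->title, PHP_EOL;
        }
        if ($next !== null) {
            echo 'Next: ', $next->title, PHP_EOL;
        }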

    Read the article

  • Non-intentional hosting of material that is protected by copyright law

    - by spacemonkey
    I am interested in how copyright law works in the case of Rapidshare, YouTube, etc. Hypothetically speaking, what if I create a website for uploading and sharing MP3 files, and some users start uploading songs in MP3 format that are protected by copyright? Can I get sued for this, given that it wasn't me who uploaded that content? Thanks! P.S. Is there a good source where I could read about the legal issues around hosting copyrighted material?

    Read the article

  • How does Google Analytics aggregate the Count of Visits (Frequency & Recency Report)?

    - by Brian Dant
    Here's my simple understanding of Count of Visits: each person that comes to my site gets one "count" for each visit. They are put into a bucket of people with the same number of total counts: if you visit twice, you are in the two bucket; if you visit six times, you are in the six bucket. From there, a report (Frequency & Recency) makes a line for each bucket, reaches into the bucket, and totals the number of people in it, putting that total in the second column.

    My question: will a two-month report automatically put someone into two buckets, and put them on two separate lines in the Count of Visits table? This explanation makes it seem like a two-month report will put the same person into a bucket twice, one bucket for each month, and then show that person's visits on two different lines instead of aggregating them.

    Example for clarification: Bob comes to my site three times in January and seven times in February. I run a report for Jan 1 -- Feb 28. Will Bob be on both the Three Count line and the Seven Count line, or will he be on the Ten Count line?

    Read the article

  • Facebook Comments Lost

    - by Rish
    I am using Facebook comments on a couple of my blogs at the moment, and I just found that all the previous comments made on posts have somehow disappeared and are no longer being displayed. I'm using WordPress for all of these blogs and the Facebook Comments for WordPress plugin to manage the Facebook comments, but they all vanished all of a sudden. Another problem I've been facing lately is that I can't seem to moderate the Facebook comments. When I go to http://developers.facebook.com/tools/comments, where there should be a list of all the comments made on my sites (against the applications I created just for the sake of comments), there is nothing there. This has been the case from the start, even before the comments vanished from my site today. So technically, there are two issues to solve here.

    Read the article

  • How to easily delete all email forwarders in cPanel?

    - by psoft
    I know that I can import a list of email forwarders using CPanel, but how can I delete a list? I want to manage 300+ addresses - as a membership list for my organization. I want to be able to delete that many without clicking 'Delete' and then 'Confirm' (or whatever it is) 300 times. Even if I am able to simply delete ALL forwarders, then upload a modified list - that's fine with me. Note: I'm using a shared hosting package through SiteGround. The tech service rep informed me that I can't use CPanel scripts in a shared environment. Any suggestions?

    Read the article

  • Make blogger load faster

    - by Wladimir Ivanov
    Hi all. I use Blogger as the platform for an electronic music blog. Because of the blog's subject matter I embed many iframes (YouTube & SoundCloud), and of course this makes the articles load slowly. Almost every article on this blog consists of some text and many iframes below it. What should I do in this particular case to make the articles (pages) load faster? Is there a ready-made solution, or should I use something like a jQuery lazy-load plugin so the iframes only load once the user scrolls to them? Any help is greatly appreciated.

    Read the article

  • Suggested ways of collecting thousands of links to mainstream media articles

    - by Matt
    I'm currently running a modified WordPress site that is designed simply to publish links to other sites, similar to The Drudge Report. Right now I have a few dozen Google Alerts set up; I go through each result manually, and if it matches a few niche keywords I'm working with, I add a link to the article on my site. I do the manual checking because Google Alerts sometimes finds links to sites that belong to service providers, organizations, or products, but all I want are mainstream news articles. So my question: is there a more efficient, and ideally automated, way to perform these highly selective searches and aggregate such links?
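
    One semi-automated approach, sketched below under a few assumptions: Google Alerts can deliver results as a feed instead of email, and a short script can then keep only links whose host matches a whitelist of mainstream news domains before anything is added to the site. The feed URL and domain list are placeholders.

        <?php
        // Sketch: filter an alerts feed down to mainstream news links.
        $feed_url = 'https://www.google.com/alerts/feeds/XXXXXXXX/XXXXXXXX';   // placeholder
        $allowed  = array('nytimes.com', 'reuters.com', 'bbc.co.uk');          // placeholder list

        $feed = simplexml_load_file($feed_url);
        if ($feed === false) {
            die('Could not load feed');
        }

        foreach ($feed->entry as $entry) {                 // Alerts feeds are Atom: one <entry> per hit
            $link = (string) $entry->link['href'];

            // Alert links are often wrapped in a google.com/url?...&url=... redirect.
            parse_str((string) parse_url($link, PHP_URL_QUERY), $params);
            if (isset($params['url'])) {
                $link = $params['url'];
            }

            $host = parse_url($link, PHP_URL_HOST);
            foreach ($allowed as $domain) {
                if ($host && substr($host, -strlen($domain)) === $domain) {
                    echo $entry->title, ' => ', $link, PHP_EOL;   // candidate link for the site
                    break;
                }
            }
        }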

    Read the article

  • Track url from Amazon S3 using Google Analytics

    - by morktron
    I couldn't find any decent pay-per-view video solutions for low-budget clients, so I'm considering using a membership extension with Joomla and hosting the video on Amazon S3. The only issue is that once someone has signed up to view or download the video, anyone with a little web development experience will be able to grab the URL of the video and freely publish it on the web. How can this be prevented? It looks like it can be done using IAM User Temporary Credentials with the AWS SDK for PHP, but the client would prefer not to pay someone to spend hours writing custom PHP code to get this working. With Amazon S3 I could at least check the log files and manually monitor the URL, I guess, but is there a way to track the URL with Google Analytics, or is there a more elegant solution?
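
    For what it's worth, the pre-signed URL route is usually only a few lines with the AWS SDK for PHP; the sketch below uses the v3 createPresignedRequest syntax, and the region, bucket, and key are placeholders. Because the link expires, a leaked URL stops working on its own. Google Analytics would only record the click on your own membership page, not the S3 download itself, unless you route the download through an intermediate tracked page.

        <?php
        // Sketch: generate a short-lived link to a private S3 object (AWS SDK for PHP v3).
        require 'vendor/autoload.php';

        use Aws\S3\S3Client;

        $s3 = new S3Client(array(
            'version' => 'latest',
            'region'  => 'us-east-1',           // placeholder region; credentials come from the environment
        ));

        $cmd = $s3->getCommand('GetObject', array(
            'Bucket' => 'my-video-bucket',      // placeholder bucket
            'Key'    => 'videos/lesson-1.mp4',  // placeholder key
        ));

        // The link dies after 15 minutes, so republishing it elsewhere is pointless.
        $request = $s3->createPresignedRequest($cmd, '+15 minutes');
        echo (string) $request->getUri();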

    Read the article

  • Tracking AdWord ads with different text in Google Analytics

    - by at01
    I'm trying to see how the text in my Google AdWords ads affects my metrics in Analytics. I have auto-linking enabled, so I figured I would be able to automatically see this in Analytics. Unfortunately, if I try to add a second dimension of Traffic Sources-Ad Content, the metrics are only split by the ad's Headline. Most of my tests are changing only the ads' descriptions... So I guess I need to add a tracking parameter like ?campaign=special_text to my URLs? Or is there a way to see the ads split by ad descriptions? Should I add the full suite of utm_campaign/utm_medium/etc parameters? What's the proper way to track these ads which are mostly similar except the ad descriptions?

    Read the article

  • Which Bliki (Blog+Wiki) solution can you recommend?

    - by asmaier
    I'm searching for a good Bliki solution, meaning a combination of blog and wiki that I can install on my own web space. I would like to be able to write articles in wiki style, much like with MediaWiki: a wiki markup language, a revision history, comments, internal links to other pages (possibly in other languages), and collaborative editing of articles.

    On the other hand, I would like a blog-like view of my articles, showing new articles (and changes to existing articles) in time order. It would be nice to be able to search through the articles and also to tag them, so a tag cloud could be generated. A nice extra would be the ability to order articles by views, or even a voting system for articles. A permission system to keep certain articles private, showing them only to people logged in to the platform, would also be good.

    Apart from these nice-to-have features, an absolute must-have for the Bliki platform I'm searching for is the ability to handle math equations (written in LaTeX syntax) and display them either as images, like MediaWiki, or even better using MathJax.

    At the moment I'm using a web service called Wikidot, which offers some of these features, but the free version shows too many advertisements, the blog feature is not mature, the design is quite ugly, and page loading is often slow. So I want to install a Bliki solution on my own web space. Can you recommend a solution?

    Read the article

  • Put a link on the nav bar in Wordpress

    - by Rafe Kettler
    I have a Wordpress blog. On the same domain, I have some other stuff hosted that isn't part of my WP install. I want to link to those other places on my domain from the top menu bar (nav bar) on my blog. How can I do that? The theme is Lightword, relevant header.php code follows:

        <body <?php body_class(); ?>>
        <div id="wrapper">
        <?php lightword_header_image(); ?>
        <div id="header">
          <?php lightword_rss_feed(); ?>
          <div id="top_bar">
            <div class="center_menu">
              <ul id="front_menu" <?php
                global $lw_remove_searchbox, $lw_use_wp_menus;
                $lw_menu_width = "";
                if($lw_remove_searchbox == "true") $lw_menu_width = " class=\"expand\" ";
                echo $lw_menu_width;
              ?>>
                <?php echo lightword_homebtn(__('Home','lightword')); ?>
                <?php
                if ( function_exists('wp_nav_menu') && $lw_use_wp_menus != "true") {
                    $lightword_menu = wp_nav_menu( array(
                        'menu'           => 'lightword_top_menu',
                        'echo'           => false,
                        'menu_id'        => 'front_menu',
                        'container'      => '',
                        'theme_location' => 'lightword_top_menu',
                        'link_before'    => '<span>',
                        'link_after'     => '</span>'
                    ) );
                    $lightword_menu = preg_replace( array(
                        '/^<ul id="front_menu" class="menu">/',
                        '/\n<\/ul>$/'
                    ), '', $lightword_menu);
                    echo $lightword_menu;
                } else {
                    echo lightword_wp_list_pages();
                }
                ?>
              </ul>
            </div>
            <?php echo lightword_searchbox(); ?>
          </div>
        </div>
        <div id="content">
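
    Since the theme builds the bar with wp_nav_menu(), one low-touch option, sketched here with a placeholder URL and label, is to append an item from the theme's functions.php via the standard wp_nav_menu_items filter instead of editing header.php; the <span> wrapper mimics the link_before/link_after markup the theme already uses. (If the Lightword menu shows up under Appearance -> Menus, adding a "Custom Link" there is simpler still.)

        <?php
        // Sketch (placeholder URL/label): add to the theme's functions.php.
        function my_extra_top_bar_link($items, $args) {
            // Only touch the Lightword top menu, not any other registered menu.
            if (isset($args->theme_location) && $args->theme_location === 'lightword_top_menu') {
                $items .= '<li><a href="http://example.com/other-stuff"><span>Other Stuff</span></a></li>';
            }
            return $items;
        }
        add_filter('wp_nav_menu_items', 'my_extra_top_bar_link', 10, 2);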

    Read the article

  • Getting a domain sub-directory url for a new server

    - by Xianlin
    I have a web application server running Tomcat, and I need to publish my APIs to internet users. However, I don't have a domain name for this server, so I can only give out its IP address (e.g. 145.XXX.XXX.XXX) to indicate where my API XML files are located. I have another web server with a domain name, "http://www.webserver.com", registered on the internet, and I want to use that domain name to serve the location of my web application server's API XML files. How can I do that: using "www.webserver.com/api" or using "api.webserver.com"? Which is better? I also wonder, if I want to publish an "rtsp://145.XXX.XXX.XXX" link for video streaming, can I replace it with "rtsp://www.webserver.com/api", and how would I do that? I always thought a URL containing a domain sub-directory cannot point to another IP address; it can only point to another folder on the web server itself.

    Read the article

  • Social media measurement tools [on hold]

    - by user29187
    I work for a non-profit and we are currently trying to develop a system to measure and evaluate our social media platforms (Facebook and Twitter). We would like to track a variety of social metrics, including, but not limited to:

    Twitter: # of followers, # of mentions, # of retweets
    Facebook: # of likes, # of people talking about this, # of page views

    We are currently using a paid platform for this. I wondered whether there is a way to configure Google Analytics to do this, or whether there are other free and clever ways to track social engagement with our brand?

    Read the article

  • Does Google use Chrome to check if a link is used by humans or is just there for the bots?

    - by sam
    Does clicking a link in Chrome tell Google the link is used by humans and is therefore not just automated backlink spam? It sounds weird, but I read it today on a slightly obscure SEO blog: they mentioned clicking the backlinks they make in a version of Chrome with the "send data anonymously to Google" feature turned on. It sounded a bit far-fetched, but then I thought there could be some truth to it; with Google now looking harder at "spammy" links, it would mean at least some humans are using them. Has anyone else heard anything about this?

    Read the article

  • Slashdotted web site seeks new home

    - by Arthur Edelstein
    I am maintaining a website that contains mostly simple HTML (just a little PHP). Normally the site receives only 4,000 hits per month, but it was recently slashdotted by the New York Times (30,000 visitors and 30 GB in a day) and the web host provider (Bluehost) throttled the CPU in response, which slowed the website down considerably. Which web host providers would offer a more scalable solution? Ideally I would like a high-quality host that charges by the GB and can let bandwidth expand during sudden slashdotting episodes without a reduction in performance.

    Read the article
