Search Results

Search found 11896 results on 476 pages for 'smart pro'.


  • Is it possible to get free web host for my registered domain? [closed]

    - by Ahmed Alsayadi
    Possible Duplicate: How to find web hosting that meets my requirements? I searched online for many free web hosting sites such as NetFirms, but most of them ask you to register one of their sub-domains or to buy a new domain. I already have a domain, which I bought through GoDaddy. Now I am hoping to find a free web host for my website (it is under 20 MB). Any idea which web hosts can meet these requirements?

    Read the article

  • How to phrase the from field in system generated emails my site sends?

    - by Genadinik
    I have a community site that sends emails after certain actions:
    1) When someone makes a comment
    2) When someone uses a feature called "suggest solution"
    3) When someone makes a comment inside a suggested solution, which is different from a regular comment
    What I am wondering is: what is the best way to phrase the From field of these emails? Right now it is something like
    1) [email protected]
    2) [email protected]
    3) [email protected]
    but 2 and 3 look strange when you receive the email. What is a nice, professional way to send these? Thanks!

    Read the article

  • Apache FilesMatch regexp: Can it match the 10-digit (Rails-generated) cache buster following the filename?

    - by ynkr
    According to the Apache FilesMatch docs: "The FilesMatch directive provides for access control by filename." Basically, I only want to set an Expires header for resources that have a 10-digit "cache buster" id appended to the name. Here is my attempt at such a thing in my httpd.conf:

        <FilesMatch "(jpg|jpeg|png|gif|js|css)\?\d{10}$">
            ExpiresActive On
            ExpiresDefault "now plus 5 minutes"
        </FilesMatch>

    And here is an example of a resource I want to match: http://localhost:3000/images/of/elvis/eating-a-bacon-sandwich.png?1306277384. My FilesMatch regexp is obviously not matching, so I am guessing one of two things is happening: either my regexp is wonky, or the "?1231231231" cache-busting part of the URL is not something Apache considers part of the filename. Can anybody confirm, and/or give me a way to cache only those resources that will not persist beyond the next deploy?
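    A possible workaround, sketched below under one assumption: Apache's FilesMatch tests only the file path, never the query string, so the "?1306277384" cache buster is invisible to it. Matching on the extensions alone (which the Rails-style URLs still end in, just before the "?") and letting mod_expires add the header is the usual fallback; it cannot distinguish cache-busted resources from plain ones, and the five-minute window below is simply the value from the question.

        # httpd.conf sketch: FilesMatch sees the path, not the query string,
        # so match the extensions and let mod_expires set the Expires header.
        <IfModule mod_expires.c>
            <FilesMatch "\.(jpg|jpeg|png|gif|js|css)$">
                ExpiresActive On
                ExpiresDefault "access plus 5 minutes"
            </FilesMatch>
        </IfModule>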

    Read the article

  • How to run WordPress and a Java web app on Tomcat on the same server?

    - by Chantz
    I have to run a WordPress site served via Apache2 and a Java-based webapp running on Tomcat on the same server. When users come to example.com or example.com/public-pages they need to be served by WordPress, but when they come to example.com/private-pages they need to be served by Tomcat. I asked this question on Server Fault, where the suggestions were to use a different port, a different IP, or a sub-domain. I want to go with the different-port solution, since it means I only need to buy one SSL certificate. I tried the reverse-proxy method by putting the following in my default-ssl.conf:

        <VirtualHost _default_:443>
            ServerAdmin webmaster@localhost
            ServerName localhost:443
            DocumentRoot /var/www
            <Directory /var/www>
                # For WordPress
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
            ProxyRequests Off
            ProxyPass /private-pages ajp://localhost:8009/
            ProxyPassReverse /private-pages ajp://localhost:8009/
            SSLEngine on
            SSLProxyEngine On
            SSLCertificateFile /etc/apache2/ssl/apache.crt
            SSLCertificateKeyFile /etc/apache2/ssl/apache.key
        </VirtualHost>

    As you can see, I am using mod_proxy_ajp in Apache2 for this, and Tomcat is listening on port 8009 and serving content. Now when I go to example.com/private-pages I see the content from Tomcat, but two issues remain:
    1) All my static resources get 404s, so none of my images, CSS or JS are loaded. The browser requests them as example.com/css/*, which clearly will not work, because it translates to example.com:80/css/* instead of example.com:8009/css/*, and there are no such resources in the WordPress directory.
    2) If I go to example.com/private-pages/abcd I am somehow kicked back to the WordPress site (which, naturally, displays a 404 page).
    I can understand why #1 is happening, but have no clue why #2 is. Regardless, if there is another clean solution for resolving this, I would appreciate y'all's help.
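    On issue #1, a sketch of one common way out (not a tested configuration): either deploy the webapp under a /private-pages context path in Tomcat so every URL it generates already carries the prefix, or add explicit ProxyPass rules for the asset paths the app requests. The /css, /js and /images paths below are only assumptions based on the question; adjust them to whatever the app actually serves, and note they will shadow any same-named WordPress paths.

        # Sketch: forward the asset paths the Tomcat app expects, in addition
        # to the application itself. A cleaner fix is deploying the app under
        # a /private-pages context path so its links include the prefix.
        ProxyPass        /private-pages ajp://localhost:8009/
        ProxyPassReverse /private-pages ajp://localhost:8009/
        ProxyPass        /css    ajp://localhost:8009/css
        ProxyPassReverse /css    ajp://localhost:8009/css
        ProxyPass        /js     ajp://localhost:8009/js
        ProxyPassReverse /js     ajp://localhost:8009/js
        ProxyPass        /images ajp://localhost:8009/images
        ProxyPassReverse /images ajp://localhost:8009/images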

    Read the article

  • Drupal node access for anonymous users

    - by MrDresden
    I've never used Drupal before, so this may be something that can easily be remedied, and that would be awesome. My problem is that a block containing node information can't be viewed by anonymous users (unregistered/not logged in): they get a "You are not authorized to access this content." message, but the block shows up for logged-in users. The nodes the block contains are events, so the block shows events for the next week. I've checked the user access settings but can't find anything that could possibly remedy this. I'm using Drupal core 6.26, Event 6.x-2.x-dev, and Event views 6.x-2.4. If anyone has any information or solutions, I'd greatly appreciate it.

    Read the article

  • Will these tool tips get me penalized?

    - by user21100
    I have a page of 10 products. Each product has a list of (7) features associated with it. If a user hovers over the name of a feature (e.g., "Moisture resistance"), a tooltip description displays. The descriptions (a sentence or two each) are loaded once on the page using JavaScript, but the titles are not, so I essentially have a bunch of redundant tooltip titles. I am concerned this will look like keyword stuffing to the bots. Anyone know about this? Maybe I should load the feature titles with JavaScript as well?
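    If the titles did need to come out of the static markup, a minimal sketch of the JavaScript route follows; the data-feature attribute, the element selection and the example strings are all invented for illustration, not taken from the site in question.

        // Sketch: keep each feature title once, in a JS object, and write it
        // into every matching cell at load time. Names here are hypothetical.
        var featureTitles = {
          moisture: "Moisture resistance",
          warranty: "Lifetime warranty"
        };

        document.addEventListener("DOMContentLoaded", function () {
          // each feature cell is assumed to carry a data-feature attribute
          var cells = document.querySelectorAll("[data-feature]");
          for (var i = 0; i < cells.length; i++) {
            var key = cells[i].getAttribute("data-feature");
            if (featureTitles[key]) {
              cells[i].textContent = featureTitles[key];
            }
          }
        });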

    Read the article

  • Should I create topics in a forum I'm about to launch so that new users won't feel it is "empty"?

    - by janoChen
    I'm about to launch a discussion forum about Taiwan, and I'm trying to figure out how to deal with the first visitors. I've thought about the following so far:
    1) Invite a few friends to start some discussions and post some replies.
    2) Create discussions myself and reply to them myself (with another account).
    I don't want the first visitors to feel like the site is empty. Maybe I'm missing something. Any suggestions?

    Read the article

  • How to keep google rank and index for a page that changed its url? [closed]

    - by ProSoft
    Possible Duplicate: How to tell Google that I have changed my website URLs? Recently I changed the URL of one of my web pages. Of course, I did it via URL rewriting. Now I want to keep the page's rank in Google and Bing. For example:
    Main address of the page: http://mywebsite.com/page1.php
    Virtual address via URL rewriting: http://mywebsite.com/page
    The new address is: http://mywebsite.com/newTitlePage
    Now, when I open this page from a Google search result, I get a 404 (not found) error. What should I do?
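    For reference, the mechanism usually used to carry rank from an old URL to a new one is a permanent (301) redirect rather than a rewrite alone. A minimal .htaccess sketch, using the paths from the question and assuming mod_rewrite is already enabled (since rewriting is already in use):

        # Sketch: send visitors and crawlers hitting the old address to the
        # new one with a 301, so search engines transfer the page's signals.
        RewriteEngine On
        RewriteRule ^page1\.php$ http://mywebsite.com/newTitlePage [R=301,L]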

    Read the article

  • Can anyone list some real examples of 'HTML5' being used in the wild?

    - by betamax
    I am using HTML5 in the same way everyone seems to be using it these days, meaning HTML5 tags, Canvas / 3D / JavaScript, and CSS3. I am struggling to find examples of sites that use these technologies practically, and that are not just a demo of something cool someone has managed to do with Canvas or CSS3 transforms or shapes. I am looking for sites that have a nice visual look but also take advantage of things like animation, scrolling and offset à la Silverback or the Canvas to create an interactive and, I guess, 'Flash-looking' site. These are some examples that I have found:
    Scrolling: http://nikebetterworld.com/index and http://benthebodyguard.com/
    Animation: http://www.elladesign.com/contact.html
    Other: http://www.pirateslovedaisies.com/
    I am using the term 'HTML5' loosely, and I hate doing so. I would be happy if you listed a really visually appealing JavaScript-based site even if it didn't have the HTML5 doctype.

    Read the article

  • Another website is mirroring and ranks above my site in search results

    - by Marlboro Goodluck
    There is a site of ill repute known as thedirty that has completely mirrored my site and now has links appearing at the #1 spot on Google using my content. I checked my log files and noticed that this site has been crawling mine for some time, and it also has 10,000 links from its site to mine. I have already blocked access for users referred from this site and reported it to Google as web spam. I have also disavowed the domain. How are they getting top links in Google (even overtaking mine) with such nefarious tactics? What are the steps to completely eliminating an issue such as this?

    Read the article

  • Best approach to creating self-updating content - i.e. chat rooms, shoutboxes and so on

    - by Anonymous -
    The only way I can think of to have a shoutbox or similar element update itself when somebody posts a new 'shout' (so that it loads in everyone else's browsers) is to have JavaScript check for updates every x seconds. I expect this could get a bit resource-intensive, though, if many people were to leave their browsers open on the page, idling. Is this the only way, or am I missing something? I'd prefer to stick to only HTML, CSS, JavaScript (AJAX) and PHP.
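    For the polling approach itself, a bare-bones sketch: the page asks the server only for shouts newer than the last one it has seen, so an idle browser costs one small request per interval rather than a full page load. The shouts.php endpoint, the JSON response shape, the #shoutbox element and the 10-second interval are all made-up values for illustration.

        // Sketch: poll a PHP endpoint every 10 seconds for new shouts.
        var lastShoutId = 0;

        function pollShouts() {
          var xhr = new XMLHttpRequest();
          xhr.open("GET", "shouts.php?since=" + lastShoutId, true);
          xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
              // assumed response: a JSON array like [{"id": 5, "text": "hi"}]
              var shouts = JSON.parse(xhr.responseText);
              for (var i = 0; i < shouts.length; i++) {
                var li = document.createElement("li");
                li.textContent = shouts[i].text;
                document.getElementById("shoutbox").appendChild(li);
                lastShoutId = shouts[i].id;
              }
            }
          };
          xhr.send();
        }

        setInterval(pollShouts, 10000);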

    Read the article

  • How do spambots work?

    - by rlb.usa
    I have a forum that's getting hit a lot by forum spambots, and of course the best way to defeat something is to know thy enemy. I'll worry about defeating those spambots later; right now I'd like to know more about them. Reading around, I was surprised by the lack of thorough information on the subject (or perhaps by my ineptness at entering the right search terms for better Google results). I'm interested in learning all about spambots. I've asked on other forums and gotten brush-off answers like "Spambots are just users registering on your site."
    How do forum spambots work? How do they find the 'new user registration' page? (I'm especially surprised because some forums don't have a dedicated URL for this, e.g. www.forum.com/register.html, but instead use query strings or other methods invisible in the URL bar.) How do they know what to enter into each 'new user registration' field?
    How do they determine which pages they can spam or enter data into and which they cannot? Do they even 'view' the page at all? If not, then I'd assume they're communicating with the server directly: how is this possible? How do they do it?
    Can forum spambots break CAPTCHAs? Can they solve logic questions (how?)? Math questions? Do they reverse-engineer client-side anti-bot validation scripts? Server-side scripts? What techniques are still valid to prevent them?
    Where do spambots come from? Is someone sitting behind the computer snickering as they watch their bot destroy site after site? Or are they snickering as they simply 'release' it onto the Internet somehow? Are spambots run by infected computers somewhere? Do they replicate themselves? Etc.

    Read the article

  • What are the disadvantages of installing an SSL certificate for the naked domain?

    - by user1744649
    I might buy an SSL certificate for my site. I know that it will help me in many ways, but will there be disadvantages as well? For example:
    1) If I load an image from another server (using plain http), will that alert the customer that something is wrong?
    2) Will I be able to use all existing software like phpBB, AWStats etc. without a problem?
    3) Will there be any issue if I redirect a page from my domain.com to my subdomain.domain.com using a meta refresh or .htaccess?
    4) Will there be any issue if I redirect a page from my subdomain.domain.com to my domain.com using a meta refresh or .htaccess?
    5) Any other issues I might run into? Thanks.

    Read the article

  • How to interpret Google's "Avg. Page Load Time"?

    - by hawbsl
    Is there any industry rule of thumb for what's considered an unacceptable load time vs. an OK one vs. a blistering fast one? We're just reviewing some Google Analytics data and are seeing 0.74 seconds reported as Avg. Page Load Time. I guess that's OK. However, it would be good if some meatier comparison data were available, or a blog post, or somewhere with some analysis of what speeds are generally being achieved by various kinds of sites. Any useful links to help someone interpret these speeds? If you Google it, you just get a lot of results dealing with how to improve your speed; we're not at that stage yet.

    Read the article

  • How to use Google Analytics as an affiliate to track sales data

    - by lalex
    As an affiliate, how can we get more information on sales? It looks like the Goals feature in GA is for those who have control over the receipt page, but we are sending users away via an affiliate link. With event tracking we've been able to count the clicks and see which links are clicked the most, but not which ones actually convert. We want to find out the following about each sale:
    1) Did the converted user come from search or internal traffic?
    2) If it was search, which keyword brought the user to our site (before they clicked away and converted)?
    Is this possible?
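    One pattern that gets part of the way there with the classic asynchronous tracker, sketched below with invented category/action/label strings: record an event at the moment the affiliate link is clicked, which in principle lets the click be segmented later by traffic source and keyword in GA's reports. It still cannot see the merchant's receipt page, so actual conversion data has to come from the merchant's own affiliate reporting.

        // Sketch using the classic async Google Analytics queue (_gaq); the
        // standard ga.js snippet is assumed to already be on the page.
        function trackAffiliateClick(link) {
          _gaq.push(['_trackEvent', 'Affiliate', 'Click', link.hostname]);
          // give the tracking beacon a moment to fire before leaving the page
          setTimeout(function () { window.location = link.href; }, 150);
          return false;
        }

        // usage (hypothetical merchant URL):
        // <a href="http://merchant.example/product"
        //    onclick="return trackAffiliateClick(this);">Buy it</a>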

    Read the article

  • Site inaccessible by some people, fine for others [on hold]

    - by Paul Howell
    A couple of days ago my website www.howellphoto.com (a WordPress site hosted by one.com) started loading really slowly, and I have been unable to access any pages linked from the homepage. Several of my friends have found the same issue, yet many are able to access the site without problems. Live support at one.com has not been all that much help, requesting the IP addresses of a few people who cannot access the site and saying it could be a firewall issue. WordPress support (my site was created with prophotoblogs) has been better and has updated all plugins, etc., but can see no issue from their end. My main issue is that even if there were a local fix I could do on my computer, that would not help with any potential customers visiting my site for information. This is driving me crazy!!! Any help will be legendary! Cheers, Paul

    Read the article

  • Need private personal access to ~three PHP pages

    - by Roger
    I would like secure access to the text output by three PHP scripts (the output is JavaScript and HTML). The security level needed is much less than for financial data, but it is important nonetheless. I have considered purchasing AND studying HTTPS and SSL certificates, but HostGator charges an extra $2/month for a private IP plus $50+ annually for a certificate, and this is more than I want to spend on this project (time + money). Is there a simpler solution that is:
    1) less expensive
    2) easier to implement?
    I'm open to different approaches.
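    One low-cost option, sketched below on the assumption that the host honours .htaccess files: plain HTTP Basic authentication in front of the three scripts. Without SSL the credentials travel essentially in the clear (only Base64-encoded), which may be acceptable given the stated security level. The script names and the password-file path are placeholders.

        # .htaccess sketch: password-protect three scripts with HTTP Basic auth.
        # Create the password file once with:  htpasswd -c /home/user/.htpasswd roger
        <FilesMatch "^(page1|page2|page3)\.php$">
            AuthType Basic
            AuthName "Private"
            AuthUserFile /home/user/.htpasswd
            Require valid-user
        </FilesMatch>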

    Read the article

  • Restricted Flow Of Power

    - by user13827
    I'm sure all is fine, but I need some reassurance. Last month my company consolidated two of its websites, www.fdmgroup.com and www.fdmacademy.com, into one newly designed www.fdmgroup.com. Because the FDM Academy grew as its own brand, we decided not simply to forward that domain to the fdmgroup website, but instead to mirror the new FDM Group website and use canonical tags pointing at the FDM Group domain (so the link juice passes to the FDM Group domain pages). The website has been live for nearly a month and I don't believe any power has passed down through the FDM Group website to its deeper pages, even though 301 redirects from the legacy group and academy domains are in place. I am also seeing the same problem on the FDM Academy domain, but I expect that, since every page has a canonical to the same page on the FDM Group domain. Is there anything that could be restricting the flow of power through the site, or am I just being impatient? Thanks in advance. Jon

    Read the article

  • I need a multi-language site with webshop functionality. Which CMS to choose?

    - by ec30
    I need to develop a multi-language site that includes simple webshop functionality. I have extensive experience with WordPress. There are numerous shopping-cart plugins available for WordPress, but none of them is compatible with multi-language plugins such as WPML. Drupal is an option I have looked into (using i18n and Ubercart), but I am not sure it is the solution I am looking for. Another option I have considered is developing a custom WordPress cart plugin that is compatible with WPML. Is anyone familiar with this situation? Any recommendations for CMSes that fit my needs? Thanks!

    Read the article

  • 301 redirect from a country specific domain

    - by Raj
    I originally started using a .do domain extension for my site, but later realized that this country-specific domain would prevent us from appearing in search results for places outside the Dominican Republic. We started using a .co domain extension and redirected all requests to the new domain using an HTTP 301. The "Crawl Stats" in Google Webmaster Tools show me that the .co domain is being crawled, but the "Index Status" shows the number of pages indexed as 0. The "Crawl Stats" for the .do domain say that it's being crawled, and its "Index Status" shows a number greater than 0. I also set a "Change of Address" in Google Webmaster Tools to have the .do domain point to the new .co domain. We're still not appearing in search results at all, even for very specific strings where I would expect to find us. Am I doing something wrong?

    Read the article

  • Framework for interaction between web-page and server-side script

    - by Carrier
    I want to make a web page that will have several control elements, among them check-boxes, radio buttons, and "range selectors" (where you can specify a min and max value, as when selecting a price range in an online shop). The new values should be sent to the server side as soon as they change (without any Submit buttons etc.), and the server side may return something (one or more numbers, etc.). Does anyone know a good AJAX-style framework that allows such a solution to be built in an easy way, with minimal adaptation or changes? It would be good if the server side of an existing solution were in Perl (not a big deal, but I know it much better than PHP or anything else). The set of controls might change and depend on other parameters, so adding one extra element should not mean rewriting the whole thing.
    P.S.: I haven't worked in this area for quite a while, so I'm not aware of existing solutions and don't want to reinvent the wheel and write everything from scratch for something that already exists (at least, I hope it does). Thanks in advance!
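    For the interaction itself, jQuery (or plain XMLHttpRequest) is often enough without a heavier framework. A sketch follows; the controls.cgi endpoint, the #controls form and the #result element are invented names, and the server side could just as well be a Perl CGI script reading the posted parameters.

        // Sketch: whenever any control inside the #controls form changes,
        // post the whole form state to a server-side script and show the
        // returned value(s). Assumes #controls is a <form> around the inputs.
        $(function () {
          $('#controls').on('change', 'input, select', function () {
            $.post('controls.cgi', $('#controls').serialize(), function (data) {
              $('#result').text(data);   // e.g. one or more numbers from the server
            });
          });
        });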

    Read the article

  • How to hide website's real address

    - by Nick
    I'm building a website for public use. It's a sharing website: everyone is allowed to download specific content, but I want to make sure nobody knows where the files are actually kept, so I've decided to use URL forwarding, e.g. when someone visits fakesite.com, it serves realsite.com's content without revealing or redirecting to realsite.com. Question: I don't know how to make this work. Please help me by explaining how to use URL forwarding! Thanks!
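    For completeness, a sketch of one way the "serve realsite.com's content under fakesite.com without redirecting" effect is commonly achieved: a reverse proxy on the fakesite.com server (frame-based "URL masking" offered by registrars is the other common route). The directives are standard mod_proxy; the domain names are the placeholders from the question, and mod_proxy / mod_proxy_http are assumed to be enabled.

        # Sketch: fakesite.com's Apache fetches pages from realsite.com and
        # serves them under its own name, so visitors never see realsite.com.
        <VirtualHost *:80>
            ServerName fakesite.com
            ProxyPass        / http://realsite.com/
            ProxyPassReverse / http://realsite.com/
        </VirtualHost>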

    Read the article

  • Am I harming myself by having two domains pointing at the same thing?

    - by Earlz
    I have a domain I recently purchased. I went ahead and pointed it at my website (via DNS), and by default my server now serves my website on this new domain. Eventually the new domain will replace my old domain (with 302 redirects and all that). However, I've not yet got my website ready for that, because I'll need to do some rebranding and such first. Am I actively hurting my SEO ratings and such by having these two domains point to the same thing?

    Read the article

  • Best way to setup hosts, subdomains, and IPs [closed]

    - by LynnOwens
    I own a domain; let's call it mydomain.com. I need to host the following off it:
    forums.mydomain.com
    www.mydomain.com
    blog.mydomain.com
    objects.mydomain.com
    I believe I can get 5 static IPs, and I plan on assigning one each to those four hosts. Then I need to create names ad hoc, all below objects.mydomain.com, for instance:
    one.objects.mydomain.com
    two.objects.mydomain.com
    three.objects.mydomain.com
    I need to create these names programmatically and without human intervention. Preferably they would not get their own IPs; they would use the IP of objects.mydomain.com.
    First question: does this mean that I need to host my own DNS?
    Second question: I'm using Apache as a web server. What does the virtual host configuration look like? I was experimenting with the following to understand how routing on domain names works, and I always ended up at www:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerName www.mydomain.com
            ServerAlias www.mydomain.com
            DocumentRoot "E:/Static/www"
            RewriteEngine On
            RewriteRule ^(/www/.*) /www$1
        </VirtualHost>

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerName forums.mydomain.com
            ServerAlias forums.mydomain.com
            DocumentRoot "E:/Static/forums"
            RewriteEngine On
            RewriteRule ^(/forums/.*) /forums$1
        </VirtualHost>
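    On the ad-hoc one.objects / two.objects names, a sketch of how this is often handled without touching DNS for every new name: a wildcard DNS record for *.objects.mydomain.com pointing at the objects IP, plus a wildcard ServerAlias, so every name under objects.mydomain.com lands in one catch-all virtual host. Hosting your own DNS is not necessary if the DNS provider supports wildcard records. The DocumentRoot below just follows the E:/Static layout from the question; and the "always ended up at www" symptom is often a missing NameVirtualHost *:80 line on Apache 2.2, which is needed before name-based virtual hosts take effect.

        # Sketch: one catch-all virtual host for every *.objects.mydomain.com name.
        # Requires a wildcard DNS A record for *.objects.mydomain.com.
        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName  objects.mydomain.com
            ServerAlias *.objects.mydomain.com
            DocumentRoot "E:/Static/objects"
        </VirtualHost>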

    Read the article

  • How to run/test JavaScript? [closed]

    - by user702
    I'm reading David Flanagan's "JavaScript: The Definitive Guide", 6th ed. It only actually tells users how to run JS code on page 311, where readers are told: "Client-side JavaScript code is embedded within HTML documents in four ways:
    1) Inline, between a pair of <script> and </script> tags
    2) From an external file specified by the src attribute in a <script> tag
    3) In an HTML event handler attribute, such as onclick or onmouseover
    4) In a URL that uses the special javascript: protocol."
    I was wondering what professional JS developers use to write and test their code. Do they use a good text editor with syntax highlighting and autocompletion, hit F5 in the browser to reload the page every time they make a change, and use some browser add-on to investigate errors? Or are there full-fledged IDEs similar to MS Visual Studio for non-web languages?
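    As a concrete version of the first two embedding options the book lists, a minimal page sketch (file names invented): save it as an .html file, open it in any browser, and script errors and console.log output appear in the browser's developer console (F12 in most browsers). That is all that is needed to start running and testing code with nothing but a text editor and a browser.

        <!-- Sketch: test.html; app.js is a hypothetical external script file -->
        <!DOCTYPE html>
        <html>
          <head>
            <script src="app.js"></script>        <!-- external file via src -->
          </head>
          <body>
            <script>
              console.log("inline script ran");   // inline, between script tags
            </script>
          </body>
        </html>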

    Read the article
