Search Results

Search found 14789 results on 592 pages for 'pro backup'.

Page 322/592 | < Previous Page | 318 319 320 321 322 323 324 325 326 327 328 329  | Next Page >

  • I want to create an e-learning website [closed]

    - by Viswa
    I want to create an e-learning website and host it. (Maybe after some time I want to add forms.) These are the things I know: Java, JSP, Servlets, HTML (not a guru, almost a beginner). I don't have experience creating websites; I did my college project using JSP, Servlets and JDBC. What are the things or technologies I need to know before creating the website? Is it possible for one person to create a website?

    Read the article

  • How do you enhance your website's speed without compromising the design and accessibility?

    - by Thorn007
    How do you enhance your website's load speed without killing the design and accessibility? File compression, a CDN, gzip? What are the best tools for doing so? For example, Google has optimized their site without compromising the design. Also, many websites kill the quality of their images with compression. Is there a way, or more or less a best practice, to increase speed without compromising the design and accessibility? Note: sorry for being so vague, but I don't know how else to phrase this question.
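
    On the compression side, a small Apache sketch (assuming mod_deflate and mod_expires are available on the server; it gzips text assets and lets browsers cache static files without touching the design at all):

        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/html text/css application/javascript
        </IfModule>
        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/png "access plus 1 month"
            ExpiresByType text/css "access plus 1 week"
        </IfModule>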

    Read the article

  • Facebook Share Button and Counter no longer displaying any Count

    - by donaldthe
    Is it just me, or did the Facebook Share button that displays the count of shares and likes just stop working over the past few days? The sharing still works, but the count of shares no longer displays. The link that is generated looks like this: http://www.facebook.com/sharer.php?u= and the JavaScript file on my page is http://static.ak.fbcdn.net/connect.php/js/FB.Share. I haven't changed anything, and this has worked for years.

    Read the article

  • How to redirect an international domain to a subfolder on the English site without hurting Google rankings?

    - by ernest1a
    I have two sites: www.example.de and www.main.com. www.main.com is the English version of www.example.de, which is in German. I want to keep only www.main.com. For the English version I will keep www.main.com, but the German version I want to move to www.main.com/de. I am wondering what the best solution would be for the old www.example.de: Redirect everything from www.example.de to www.main.com/de using a 301 redirect? Or redirect everything from www.example.de to www.main.com/de/page-url-of-old-site.html, so each link actually gets its own address? Is that necessary, or will Google realize where each page belongs on the new site even if I redirect everything to the home page? Any other solution, maybe just setting the new domain in Google Webmaster Tools or something like that?
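
    A minimal .htaccess sketch of the per-URL approach (assuming Apache with mod_rewrite answering for the old www.example.de host; the domain names are the ones from the question):

        RewriteEngine On
        # Only act on requests that arrive on the old German domain
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.de$ [NC]
        # Send each old URL to the same path under /de on the new domain, with a 301
        RewriteRule ^(.*)$ http://www.main.com/de/$1 [R=301,L]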

    Read the article

  • Page views in Google Analytics are off compared to a similar metric

    - by tiki16
    We have a page where a user can sign a pledge to recycle by clicking a pledge button. A script writes each pledge to a text file, which updates the number shown on the page. In the past 2 days there have been 185 pledges signed but only 63 page views in GA. I trust that they are unique pledges and not just people adding multiple fake names. Is there any way to get a better report from Google Analytics?
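
    One way to make the two numbers directly comparable (a sketch, assuming the newer analytics.js / Universal Analytics snippet is on the page; with the older ga.js tracker the call would be _gaq.push(['_trackEvent', ...]) instead, and the 'pledge-button' id and the 'Pledge'/'sign' labels are invented here) is to fire a GA event from the pledge button itself:

        // assumes the standard analytics.js snippet has already defined ga()
        document.getElementById('pledge-button').addEventListener('click', function () {
            ga('send', 'event', 'Pledge', 'sign'); // shows up under Behavior > Events
        });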

    Read the article

  • Hosting server application for global SME

    - by BBe
    We are planning to set up a complete ERP and CRM system for a medium-sized global company, which might turn into an essential tool for all locations once deployed. For now these locations include the USA, Germany, China and Indonesia, but the list is growing quickly. My question is: where is it best to physically locate the server to ensure that access times are optimal from all (future) locations? In my mind, I am picturing multiple connected servers (a cloud?), where each of our users is served by the physically closest server. Being in a very competitive field, we would also like to rule out that any data is stored in mainland China... Thanks for any advice and pointers!

    Read the article

  • Google Indexing Issue after htaccess changes

    - by Klement
    I have a site called www.FuneralCoverFinder.co.za. I have about 30 pages on the site and usually have 29 indexed (excluding 15 blog posts); they are new. I recently upgraded my entire site and made some redirection changes in my .htaccess file. I have made my URLs more SEO friendly (removing index.php/) and redirected dead pages to working pages. I have tons of unique content, all checked by Grammarly and Plagium to ensure I have no duplicate content. I have since resubmitted my sitemap to Google and now have only one page indexed. It happened within a couple of minutes; I usually see results almost immediately after submitting, but now it's stuck on 1 page indexed. I assume I might have made errors in the .htaccess file, as this was my first attempt. The site runs perfectly and all the URLs redirect the way they should. I'm scared I have some loop or other, although the website runs fine. I still see many of my old indexed pages in the SERPs; I'm just worried that the issue with the new sitemap can cause my rankings some harm. My website is pretty well SEO optimized on-site. I have about 1500 indexed backlinks and have been building them steadily over about half a year. I would really appreciate some clarity on this matter.
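
    For reference, a common loop-free pattern for dropping index.php/ from URLs is sketched below (an illustration only - the actual rules in the question's .htaccess are not shown, and this assumes Apache mod_rewrite with a front-controller style index.php):

        RewriteEngine On
        # Externally redirect old /index.php/... URLs to the clean form (one 301, no chain);
        # matching against THE_REQUEST avoids re-matching after the internal rewrite below
        RewriteCond %{THE_REQUEST} \s/index\.php/(\S*) [NC]
        RewriteRule ^ /%1 [R=301,L]
        # Internally map the clean URLs back onto the front controller
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php/$1 [L]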

    Read the article

  • How to batch remove spamming users and pages they created on MediaWiki?

    - by Problemania
    I'm trying to clean up a MediaWiki instance which has been subjected to spamming and vandalism for a period of time. The current status is that there are a large number of users which only created spam pages but typically did not alter legitimate pages, and there are fewer than 10 users which I know are legitimate and who created a small number of legitimate pages. Abstractly, my idea for fixing this messy situation is to find the complete list of users that are not in that small set of legitimate users, use the RenameUser extension to rename them all to a single Spammer user, and use the Nuke extension to mass delete all pages that user created. Any practical advice on how to proceed? Since there are hundreds of spammer users, how do I effectively rename them? It seems the Renameuser extension does not support automated batch renaming of users from a list or file.
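
    One starting point (a sketch only - run it against a database backup first, the whitelist names are placeholders, and the user table may carry a $wgDBprefix table prefix) is to pull the list of accounts that are not in the known-good set straight from MediaWiki's user table, then feed the pages those accounts created into the Nuke extension or maintenance/deleteBatch.php:

        -- list candidate spam accounts: every registered user except the known-good ones
        SELECT user_id, user_name
        FROM user
        WHERE user_name NOT IN ('Admin', 'LegitUserOne', 'LegitUserTwo');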

    Read the article

  • Restricting A Directory Through .htaccess

    - by Whitechapel
    I'm trying to put all of my FTP accounts into a folder at /public_html/ftp and password protect it so search bots can't crawl their private files. I'm also trying to redirect all site traffic from the non-www to the www host. I keep getting 500 errors when accessing the site, and I need to redirect www.vivalanation.com/ftp to www.vivalanation.com/ftp/, because /ftp without the trailing slash just errors out. Here is my .htaccess in the /public_html/ftp folder:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        AuthName "FTP Access"
        AuthType Basic
        AuthUserFile /home1/vivalst/.htpasswds/public_html/ftp/passwd
        Require valid-user

    I created a passwd file in /.htpasswds/public_html/ftp. And here is my basic .htaccess in the root of /public_html/:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
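
    For the missing trailing slash specifically, a sketch for the root /public_html/.htaccess is below (assuming mod_rewrite and that /ftp is a real directory; normally mod_dir adds the slash automatically, and the 500 itself is more likely to come from the auth setup - for example an AuthUserFile path the server cannot read - than from the rewrite lines):

        # 301 any directory request that is missing its trailing slash
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+[^/])$ /$1/ [R=301,L]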

    Read the article

  • List of events triggered on pages matching regex

    - by Cubius
    Is there a way to get a grouped list of events (such as in Top Events) which were triggered on pages matching a regular expression? I can add the Page secondary dimension in Top Events and apply the regex filter, but that way I won't get a grouped list. I can apply the filter to the Events - Pages report, but then the events are grouped only within pages, whereas I need global grouping. Any suggestions?

    Read the article

  • How do I get the root index page to redirect to a subdirectory without affecting SEO?

    - by paradroid
    I am reviving/reorganising my personal WordPress blog. It's using a URL that looks like this: http://mydomain.com/blog. The web server 301 redirects www.mydomain.com to mydomain.com. I want to use the blog subdirectory because I plan to add other parts to the site, with the blog being only one part of it. However, at the moment there is nothing there but the blog, so I want to have the root index page redirect to the blog for the time being. I have been using this on the root index.html page to do the redirect... <meta http-equiv="REFRESH" content="0;url=./blog"></HEAD> ...but this seems to have stopped the site being indexed by Google and Bing. How do I do this without affecting SEO? Also, what URL should I put in the sitemap.xml?
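
    A server-side 301 is generally kinder to indexing than a meta refresh. A minimal .htaccess sketch (assuming Apache with mod_rewrite; it only touches the bare root URL, so /blog and any future sections are left alone):

        RewriteEngine On
        # Send only the site root to /blog/ with a 301; everything else is untouched
        RewriteRule ^$ /blog/ [R=301,L]

    The sitemap.xml would then presumably list the canonical http://mydomain.com/blog/ URLs rather than the root.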

    Read the article

  • Where to ask a question about startups?

    - by Wolfpack'08
    I've got some questions about how to better run my web application development business, which has only been running for a little more than two years. It's still in its early phases, as I consider the first five years the 'early years'. Being inexperienced with business in general, I always have a lot of questions as to whether I am making the right decisions (for example, I often worry about my hiring practices, and I often worry that I may have priced new products wrongly). Is there a good site on Stack Exchange to ask questions about things like this (for example, this site, the Project Management site, the Salesforce site, or perhaps the Personal Finance site)? I'm combing through questions and answers on each of these sites now, and I can see questions that mimic my own. Nothing precisely the same, but similar things on all of these sites. Apart from just reading through previously asked questions, what is a good way to get a sense of whether or not my question fits on a site in the exchange? If you recommend going outside the exchange, please also let me know.

    Read the article

  • Website (X)HTML Code Change Detection [closed]

    - by 0pt1m1z3
    I am looking for an enterprise-grade service or tool that can be used to scan / fingerprint websites and send notifications when major XHTML code changes are detected. The tool should be able to continuously scan thousands of websites and determine the percentage of HTML code that has been modified since the last run, and then either save the data where it can be easily accessed or send periodic notifications. I know of services like ChangeDetect.com, but they don't track markup-only changes; instead they focus on everything, including content. We don't really care about the content itself, because a lot of the sites we need to cover are updated frequently with new content.
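
    A rough PHP sketch of the markup-only fingerprinting idea (assuming the pages can be fetched with file_get_contents; the snapshots/example.com.html path is invented, and stripping text nodes with a regex is deliberately crude, just to illustrate the approach):

        <?php
        // Keep only the markup of a page (crudely drop the text between tags).
        function markupOnly($html) {
            return preg_replace('/>[^<]*</', '><', $html);
        }

        $current  = markupOnly(file_get_contents('http://example.com/'));
        $previous = markupOnly(file_get_contents('snapshots/example.com.html')); // stored on the last run

        // similar_text() reports how alike the two markup strings are, as a percentage.
        similar_text($previous, $current, $percentSame);
        echo 'Markup changed by roughly ' . round(100 - $percentSame, 2) . "%\n";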

    Read the article

  • Form development optimization

    - by Juan
    Like many web developers, I build forms all the time. I found myself doing the same things every time: placing input fields, assigning a name to each, AJAXing the form, then creating the PHP, which involves assigning a PHP variable to each $_REQUEST['var'], escaping and validating the data, building the HTML and emailing the results. So I found that 70% of the work is duplicated, but I just can't duplicate a page and change the fields: I end up wasting more time reformatting, deleting and adding different fields than creating the form from scratch. I started planning to write a "list of IDs to HTML+PHP" converter, into which I'd input all the IDs and which would output the basic HTML and PHP. Then I thought: there have got to be thousands of developers who go through this; I'd be reinventing the wheel. So this is my question: I'm trying to find that wheel that somebody must have invented already. I found this: http://www.trirand.com/blog/jqform/ which does more or less what I'm looking for, but it's an expensive solution and it has too much functionality for what I'd be using it for. Which tools do you use to optimize repetitive tasks around HTML and PHP?
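
    A very small sketch of the "list of IDs to HTML+PHP" idea (the field names are invented; a real version would also need per-field validation rules and proper escaping on output):

        <?php
        // The field IDs drive both the markup and the server-side handling.
        $fields = ['name', 'email', 'message'];

        // Generate the HTML inputs from the same list.
        foreach ($fields as $field) {
            echo '<label>' . htmlspecialchars($field) . ': '
               . '<input type="text" name="' . htmlspecialchars($field) . '"></label><br>' . "\n";
        }

        // Collect and lightly sanitise the submitted values from the same list.
        $data = [];
        foreach ($fields as $field) {
            $data[$field] = isset($_REQUEST[$field]) ? trim($_REQUEST[$field]) : '';
        }
        // $data can now be validated, rendered into an email body, etc.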

    Read the article

  • PHP efficiency question [closed]

    - by Ron
    Hello everyone. I am working on a website and I am trying to make it as fast as possible, especially the small things that can make my site a little bit quicker. So, to my question: I have a loop that runs 5 times and echoes something on each iteration. If I make a variable, have the loop append the text I want to echo to that variable, and only echo the variable at the end, will it be faster?

    Loop 1 (with the echo inside the loop):

        for ($i = 0; $i < 5; $i++) {
            echo "test";
        }

    Loop 2 (with the echo outside, after the loop finishes):

        $echostr = "";
        for ($i = 0; $i < 5; $i++) {
            $echostr .= "test";
        }
        echo $echostr;

    I know that loop 2 will increase the file size a bit and therefore the user will have to download more bytes, but if I have a huge loop, will it be better to use the second loop or not? Thanks.
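
    For questions like this, a quick microbenchmark usually settles it faster than guessing. A sketch (the iteration count is arbitrary, and output buffering is used so the first variant doesn't actually print anything):

        <?php
        $iterations = 100000;

        // Variant 1: echo inside the loop (output buffered and discarded).
        ob_start();
        $start = microtime(true);
        for ($i = 0; $i < $iterations; $i++) {
            echo "test";
        }
        $echoTime = microtime(true) - $start;
        ob_end_clean();

        // Variant 2: build a string in the loop, echo once at the end.
        $start = microtime(true);
        $buffer = '';
        for ($i = 0; $i < $iterations; $i++) {
            $buffer .= "test";
        }
        // echo $buffer; // skipped here so only the concatenation is timed
        $concatTime = microtime(true) - $start;

        printf("echo in loop: %.4fs, concatenate then echo: %.4fs\n", $echoTime, $concatTime);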

    Read the article

  • Why Can't Computers Off My Network See the Site? [migrated]

    - by nmagerko
    I have just set up Apache, PHP, MySQL, etc. on my Ubuntu OS, and I was wondering why computers that are not on my network cannot see the basic index.html that Apache uses as the default. I set up a static IP address for my computer, and I use 192.168.1.100 for computers to view the simple site. Is there something I am missing that will allow others to access my site? (It is REALLY simple; no graphics, CSS, etc.)

    Read the article

  • Comparisons of Javascript 'data grids'?

    - by Joe
    I've found plenty of questions between here and Stack Exchange of people asking for the 'best' data grid / data table, or one that has a particular feature, and plenty of lists out there (of various ages) listing the various data grid implementations... but is anyone aware of any matrix of which features the various solutions implement? (e.g., allow shift-click to select multiple rows; support checkboxes for selection; can update a regular table in place; allow editing of cells; support WebSQL or IndexedDB for local caching; which browsers they support; infinite scroll; etc.) There's a generic 'JavaScript framework' comparison on Wikipedia, which would be the sort of thing I'm looking for, but it doesn't go into detail on data grids. (Which makes sense, as so many are extensions, not core features of those frameworks, and in the case of jQuery, there are lots of them.)

    Read the article

  • How to run/test JavaScript? [closed]

    - by user702
    I'm reading David Flanagan's "JavaScript: The Definitive Guide, 6th ed". It only actually tells readers how to run JS code on page 311, where they are told of the following solutions: "Client-side JavaScript code is embedded within HTML documents in four ways:

        - Inline, between a pair of <script> and </script> tags
        - From an external file specified by the src attribute of a <script> tag
        - In an HTML event handler attribute, such as onclick or onmouseover
        - In a URL that uses the special javascript: protocol."

    I was wondering what professional JS developers use to write and test their code: do they use a good text editor with syntax highlighting and autocompletion, hit F5 in the browser to reload the page every time they make a change, and use some browser add-on to investigate errors? Or are there full-fledged IDEs similar to MS Visual Studio for non-web languages?
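
    For reference, a minimal page covering the first two of those four ways looks roughly like this (a sketch; app.js is an invented file name):

        <!DOCTYPE html>
        <html>
          <head>
            <title>JS test page</title>
          </head>
          <body>
            <!-- Inline script between <script> tags -->
            <script>
              console.log('inline script ran'); // visible in the browser's developer console
            </script>
            <!-- External file referenced by the src attribute -->
            <script src="app.js"></script>
          </body>
        </html>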

    Read the article

  • Apress Deal of the day - 5/Feb/2011

    - by TATWORTH
    Today's $10 Deal of the Day from Apress at http://www.apress.com/info/dailydeal is: Pro ASP.NET 4 in C# 2010, Fourth Edition. ASP.NET 4 is the latest version of Microsoft's revolutionary ASP.NET technology. It is the principal standard for creating dynamic web pages on the Windows platform. Pro ASP.NET 4 in C# 2010 raises the bar for high-quality, practical advice on learning and deploying Microsoft's dynamic web solution. $59.99 | Published Jun 2010 | Matthew MacDonald. I am reviewing this book at the moment, but I was already sufficiently impressed by it to have bought the PDF the day it became available last December.

    Read the article

  • How to interpret Google's "Avg. Page Load Time"?

    - by hawbsl
    Is there any industry rule of thumb for what's considered an unacceptable load time vs. an OK one vs. a blisteringly fast one? We're just reviewing some Google Analytics data and are getting an Avg. Page Load Time of 0.74 reported. I guess that's OK. However, it would be good if some meatier comparison data were available, or a blog post, or somewhere with some analysis of what speeds are generally being achieved by various kinds of sites. Any useful links to help someone interpret these speeds? If you Google it, you just get a lot of results about how to improve your speed. We're not at that stage yet.

    Read the article

  • Oracle EZConnect in MediaWiki

    - by raindog308
    MediaWiki supports Oracle and I'm trying to configure it in the installer. The installer says you can use EZConnect... something like user/pass@//server.example.com/dbname or, since the installer has fields elsewhere for user/pass, server.example.com/dbname. The installer includes a link to the EZConnect docs: http://docs.oracle.com/cd/E11882_01/network.112/e10836/naming.htm. All the examples in that doc include a forward slash, but every combination I've tried results in an error like this: Invalid database TNS "sever.example.com/service_name". Use only ASCII letters (a-z, A-Z), numbers (0-9), underscores (_) and dots (.). I can't find any examples of EZConnect that don't include a forward slash. That error is from MediaWiki, not Oracle. I'm tailing the listener log and there is no connection made - MediaWiki is returning the error without trying to connect. I'm using PHP OCI8 with the Oracle Instant Client. I don't have a tnsnames.ora set up for this client - which is kind of the point of EZConnect. I did write a test PHP script that connects via oci_connect just fine. Has anyone configured MediaWiki to use Oracle with EZConnect? If so, what did you use in the installer?
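
    For comparison, a standalone EZConnect test outside MediaWiki looks roughly like this (a sketch; the credentials, host, port and service name are placeholders):

        <?php
        // EZConnect string: //host[:port]/service_name - no tnsnames.ora required
        $conn = oci_connect('wikiuser', 'secret', '//server.example.com:1521/dbname');
        if (!$conn) {
            $e = oci_error();
            trigger_error(htmlentities($e['message'], ENT_QUOTES), E_USER_ERROR);
        }
        echo "connected\n";
        oci_close($conn);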

    Read the article

  • Will not supporting IE or older browsers drive away potential visitors/users of my site? [closed]

    - by XToro
    Normally a SO browser, but this question doesn't fit there; hopefully it fits here. I just want to ask, from a web designer's point of view, whether it's wrong not to care about supporting Internet Explorer or older browsers. The site I'm designing looks great in all browsers except IE9 and below. There are certain things that IE doesn't support or behaves differently on than other browsers: WebKit stuff, some CSS styles, drag-and-drop of files from the OS, etc., but it all works great in Safari, Firefox, Chrome etc. Should I be that concerned? I know there are several people that use IE, but its limitations have just been causing me more work by forcing me to come up with workarounds. From what I've read, many of the issues I've been having should be solved with IE10, but not everybody keeps up to date. I know of several people who are still using IE6! Again, I'm hoping this is the right place to ask a question like this, and if not, please point me to the right Stack Exchange site instead of just downvoting me. Thanks! EDIT: Upon further research... So far this year, IE (all versions) and Chrome have been neck and neck at the top, with IE only just squeaking by Chrome, and Firefox a close third. But looking at the top 10 browsers, IE6 doesn't even show up on that list, in which the lowest percentage is 1.92%. Source: http://www.w3counter.com/globalstats.php?year=2012&month=7 Having a look at another site, IE6 shows up in 11th place out of 12, just before "Other": http://www.sitepoint.com/browser-trends-february-2012/ This makes me a little more wary of not spending more time on IE compatibility. However, my site will not be going to a live beta until October or November, and I'm hoping that IE10 will have more features coded into it. Currently, I've written my upload page (a "drag-and-drop files from the OS" type) to simply display "IE is not supported", leaving no other option for IE users to upload pictures, because I've spent so much time writing the uploader, which does many things other than just upload the files. I will be changing this rather cold "Access Denied" message to a suggestion to upgrade or install other browsers, with download links for each. Big thanks for the posts here and the interesting links!

    Read the article

  • redirect url ending with dot

    - by Michael
    I submitted my site's URL to my workplace's printed newsletter, and when I got the printed version, they had added a dot to the end of it. Some people will realize that the period is not part of the URL, but others will not. Is there an easy way to redirect from http://example.com/home. to http://example.com/home? I have IIS 7.0 shared hosting with GoDaddy. This means I have access to the box only through their interface, so some options might be limited.
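
    If the host exposes the IIS URL Rewrite module (an assumption for GoDaddy shared hosting, as is whether IIS even hands a trailing-dot request to the rewrite engine), a web.config sketch for this would be:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- Redirect any URL ending in a dot to the same URL without it -->
                <rule name="Strip trailing dot" stopProcessing="true">
                  <match url="^(.*)\.$" />
                  <action type="Redirect" url="{R:1}" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>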

    Read the article

  • How to Keep SEO Score from Dropping with Duplicate Content

    - by joeh0717
    I'm hoping that someone has a solution for what I'm trying to accomplish. I'm working on a travel agency web site, and there's an "Overview" section for each cruise line. These overviews are located on the index page for each cruise line. Here's my issue: the company is creating a search engine that includes details on each cruise line. Their write-ups on each cruise line are great, so I'd like to include the overview they created for each cruise line rather than having to create all new ones. However, I don't want duplicating their content to negatively affect the SEO scores of the pages they originally put this content on. It's going to duplicate, since each page that's dynamically generated by their search engine is going to include a section about the cruise line (where I'd want to place the overview). Question: is there any way that I can include these overviews (ideally, copying the exact HTML that they've already implemented) without the search engines indexing those particular code sections? I'd want the rest of the search result pages to be indexed... just not the section of each page that contains this duplicate code. I saw something about using a span with a robots-nocontent class in Yahoo (not sure if this also applies to Bing) and googleon / googleoff tags in Google. Is this the best solution? I'm open to any suggestions, thanks!
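
    For reference, the Yahoo mechanism mentioned is just a class on the wrapping element (a sketch; note that robots-nocontent was only ever honored by Yahoo's crawler, and the googleon/googleoff comments apply to the Google Search Appliance rather than Google web search):

        <div class="robots-nocontent">
            <!-- cruise line overview copied from the original page -->
        </div>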

    Read the article

  • Can I redirect HTTP requests for an old folder to the homepage using the .htaccess file?

    - by AndreaNobili
    I have the following situation: I had an old blog that was made using Joomla (this blog was indexed well enough by search engines). Because of some problems, I deleted it and have created it again using WordPress. Now I get many visits (from Google) leading to specific pages of the old site (pages that don't exist in the new version). For example, I get visits to URLs such as /scorejava/index.php/corso-spring-mvc/1-test that don't exist on my new site. I would like to know if, using the .htaccess file (or some other mechanism), I can redirect the HTTP requests directed at subfolders that don't exist in the new version to the homepage of my new site. For example, I get requests towards the now-empty URL /scorejava/index.php/corso-spring-mvc/1-test, and I would like to create a regular expression that says something like: all requests towards the subfolder corso-spring-mvc (and all its files and subfolders) have to be redirected to www.scorejava.com. Is it possible?
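
    A minimal sketch of that rule (assuming Apache with mod_alias available, and that the old URLs really do all start with /scorejava/index.php/corso-spring-mvc/):

        # Send every request under the old Joomla path to the new homepage with a 301
        RedirectMatch 301 ^/scorejava/index\.php/corso-spring-mvc/.* http://www.scorejava.com/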

    Read the article
