Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 115 of 216

  • Need suggestions on how to create a website with an encrypted database.

    - by SFx
    Hi guys, I want to create a website where a user enters content (say a couple of sentences) which eventually gets stored in a backend database (maybe MySQL). But before the content leaves the client side, I want it to be encrypted using something on the client, like JavaScript. The data will travel over the web encrypted and, more importantly, will also be permanently stored in the backend database encrypted. Is JavaScript appropriate for this? Would 256-bit encryption take too long? Also, how do you query an encrypted database later on if you want to pull down the content that a user may have submitted over the past 2 months? I'm looking for tips, suggestions and any pointers you may have on how to go about learning about and accomplishing this. Thanks!
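    As a rough illustration (not from the question): a minimal sketch of doing the client-side encryption with the browser's Web Crypto API, assuming a reasonably modern browser and an HTTPS page. The function and variable names are made up, and the hard part, where the key lives and how it is protected, is deliberately left out.

        // Minimal sketch: AES-GCM with a 256-bit key in the browser (Web Crypto API).
        // Generating the key and encrypting a short text takes milliseconds, so speed
        // is not the real concern; key management is. Requires a secure (HTTPS) context.
        const keyPromise = crypto.subtle.generateKey(
            { name: "AES-GCM", length: 256 },
            true,                         // extractable, so it could be exported/stored
            ["encrypt", "decrypt"]
        );

        async function encryptText(plainText) {
            const key = await keyPromise;
            const iv = crypto.getRandomValues(new Uint8Array(12));   // per-message nonce
            const cipherBuffer = await crypto.subtle.encrypt(
                { name: "AES-GCM", iv: iv },
                key,
                new TextEncoder().encode(plainText)
            );
            // Send the IV and ciphertext (e.g. base64-encoded) to the server for storage.
            return { iv: iv, cipherText: new Uint8Array(cipherBuffer) };
        }

    On querying: the server can only filter on columns it can read, so the usual pattern is to store identifying metadata (user ID, timestamp) unencrypted, select the user's rows for the last 2 months by those columns, and decrypt the content back in the browser.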

    Read the article

  • Webserver insists on opening "blog1.php" instead of "index.php"

    - by pepoluan
    I'm at my wits' end. I have just ripped out a website and am in the process of rebuilding everything. Previously, the home page of the website was a blog, at the address www.mydomain.com/blog1.php. After exporting everything, I deleted the whole directory and -- based on a request -- immediately created a blog/ directory. The idea is to get the blog back up as soon as possible, and temporarily redirect people accessing www.mydomain.com to the blog. Accessing the blog via http://www.mydomain.com/blog/ works. So I put in an index.php file containing a (temporary) redirect to the blog's address. The problem: the server insists on opening blog1.php instead of index.php, even after we deleted all the files (including .htaccess). Even putting in a new .htaccess file with the single line DirectoryIndex index.php doesn't work; the server stubbornly wants blog1.php. Now, the server is actually shared webhosting, so I have no actual access to it; I have to do my work via cPanel. Currently, I work around this issue by creating blog1.php, but I really want to know why the server does not revert to opening index.php. Did I perhaps miss some important setting in the byzantine cPanel menu pages?
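    For reference, a minimal .htaccess along the lines the asker describes, assuming Apache (the redirect target is the blog directory mentioned above), would be:

        DirectoryIndex index.php
        # temporary redirect of the old blog URL while the new blog settles in
        Redirect 302 /blog1.php /blog/

    If the server still serves blog1.php with something like this in place, the behaviour is probably coming from somewhere else, e.g. an "index page" setting at the hosting level in cPanel or a previously issued redirect cached by the browser; both are assumptions, since the hosting configuration isn't visible from the question.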

    Read the article

  • A new client comes into my web agency. How do I configure email and social accounts to work better? [on hold]

    - by Marco Panichi
    I have created websites for many years but still have not found the right way to organize all the email and social accounts of every client. I mean, every web agency follows dozens of customers. Each client needs at least Google Analytics, AdWords, a Facebook page, a Twitter profile, a YouTube channel, probably a listing on Google Places and maybe a MailChimp (or similar) account. The web agency, in my opinion, must own these accounts, use them to deliver results to the customer and, of course, make them available to the customer for two reasons:

    - The customer must be able to see how things are going
    - The client must have the ability to change web agency without suffering

    The web agency, however, has many problems in holding all of these accounts. For example, I like the idea of having a Gmail account for each client and using all of Google's products from that account. But it is not possible to create that many Gmail accounts from the same IP address and with the same phone number. The web agency could invite the customer to create his own accounts, but:

    - This is not necessarily a value for the customer (indeed...)
    - The web agency would still manage them from the same IP address, running into the same problems
    - If phone verification occurs, the web agency has to disturb the customer for verification

    Do you have the same problem? How do you solve it?

    Read the article

  • What's the best way to version CSS and JS URLs?

    - by David Eyk
    As per Yahoo's much-ballyhooed Best Practices for Speeding Up Your Site, we serve up static content from a CDN using far-future cache expiration headers. Of course, we need to occasionally update these "static" files, so we currently add an infix version as part of the filename (based on the SHA1 sum of the file contents). Thus styles.min.css becomes styles.min.abcd1234.css. However, managing the versioned files can become tedious, and I was wondering if a GET argument notation might be cleaner and better: styles.min.css?v=abcd1234. Which do you use, and why? Are there browser- or proxy/cache-related issues that I should consider?
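    As a sketch of how the filename variant can be automated (a build-step illustration, not the asker's actual tooling; the paths and the 8-character hash length are arbitrary choices):

        // Node.js sketch: write a content-hashed copy of a static file and return its name.
        const crypto = require("crypto");
        const fs = require("fs");
        const path = require("path");

        function versionedCopy(file) {
            const hash = crypto.createHash("sha1")
                               .update(fs.readFileSync(file))
                               .digest("hex")
                               .slice(0, 8);
            const { dir, name, ext } = path.parse(file);      // "styles.min" + ".css"
            const versioned = path.join(dir, name + "." + hash + ext);
            fs.copyFileSync(file, versioned);                 // e.g. styles.min.abcd1234.css
            return versioned;                                 // reference this name in templates
        }

        console.log(versionedCopy("static/styles.min.css"));

    The usual argument for the filename form over ?v= is that some proxies and caches have historically refused to cache, or handled poorly, URLs with query strings, so baking the hash into the filename is the safer bet for far-future expiry.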

    Read the article

  • Is it possible to make CSS-added text searchable by a browser?

    - by Andrew Stacey
    I run a website that uses CSS pseudo-classes to insert text here and there. One of them inserts the value of a CSS counter (and it would require considerable re-engineering of the system to do this without CSS text injection). The specific CSS rule is:

        .num_defn .theorem_label:after {
            content: " " counter(definition, decimal);
            counter-increment: definition;
        }

    and this converts "Definition" to "Definition 1" (say). However, the injected text is not searchable by the browser. It doesn't see the 1: if I search for "Definition 1" then it doesn't find it, and if I search for "Definition. Whatever the definition text was" then the browser happily highlights the line except for the inserted 1. So if you imagine the bold text as the highlighting, it would look like: Definition 1 . Whatever the definition text was (everything highlighted except the 1). This is not ideal! People like to refer to definitions by their number and to say "Look at Definition 1 on page XYZ" (and in contexts where hyperlinks are not available - strange, I know, but it does happen). Thus: is there any way that I, on the server end, can designate the injected text as "searchable"? If not, is there a simple way at the browser end that this can be enabled?
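    One possible workaround (my suggestion, not something from the question, and it gives up the pure-CSS approach) is to have a small script write the number into the page as a real text node, which find-in-page can see; a sketch assuming the .num_defn / .theorem_label markup from the rule above:

        // Sketch: number the definition labels with real text instead of a CSS counter,
        // so the numbers become searchable. Run after the DOM has loaded.
        document.addEventListener("DOMContentLoaded", function () {
            var labels = document.querySelectorAll(".num_defn .theorem_label");
            for (var i = 0; i < labels.length; i++) {
                labels[i].appendChild(document.createTextNode(" " + (i + 1)));
            }
        });

    The equivalent server-side fix is simply to emit the number into the HTML when the page is generated, which is effectively what making the injected text "searchable" would amount to.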

    Read the article

  • How to parse JSON data from the web faster [closed]

    - by Kaidul Islam Sazal
    I have a JSON inventory file, inventory.json, on the server, like this:

        [
          {
            "body" : "SUV",
            "color" : { "ext" : "White diamond pearl", "int" : "Taupe" },
            "id" : "276181",
            "make" : "Acura",
            "miles" : 35949,
            "model" : "RDX",
            "pic" : [ { "full" : "http://images1.dealercp.com/90961/000JNBD/001_0292.jpg" } ],
            "power" : { "drive" : "Front wheel drive", "eng" : "2.3L DOHC PGM-FI 16-VALVE", "trans" : "Automatic" },
            "price" : { "net" : 29488 },
            "stock" : "6942",
            "trim" : "AWD 4dr Tech Pkg SUV",
            "vin" : "5J8TB2H53BA000334",
            "year" : 2011
          },
          {
            "body" : "Sedan",
            "color" : { "ext" : "Premium white pearl", "int" : "Taupe" },
            "id" : "275622",
            "make" : "Acura",
            "miles" : 40923,
            "model" : "TSX",
            "pic" : [ { "full" : "http://images1.dealercp.com/90961/000JMC6/001_1765.jpg" } ],
            "power" : { "drive" : "Front wheel drive", "eng" : "2.4L L4 MPI DOHC 16V", "trans" : "Automatic" },
            "price" : { "net" : 22288 },
            "stock" : "6945",
            "trim" : "4dr Sdn I4 Auto Sedan",
            "vin" : "JH4CU2F66AC011933",
            "year" : 2010
          }
        ]

    These are two entries; there are almost 5000 entries like this. I parse the JSON like this:

        var url = "inventory/inventory.json";
        $.getJSON(url, function(data){
            $.each(data, function(index, item){
                // straight-forward loop
                if(item.year == 2012) {
                    $('#desc').append(item.make + ' ' + item.model + ' ' + '<br/>' + item.price.net + '<br/>' + item.pic[0].full);
                }
            });
        });

    This is working fine. The problem is that this searching and fetching process is a little slow, as there are already 5000 entries and the number is increasing day by day. It is a straightforward loop over the data, a plain brute-force method. Now I want to know whether there is a more time-efficient way to parse it. Is there any faster method than the straightforward loop?
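    One way to avoid scanning all 5000 entries on every lookup is to build an index once, right after the JSON loads, and answer queries from that; a sketch keeping the same jQuery calls and field names:

        var url = "inventory/inventory.json";
        var byYear = {};                               // year -> array of items

        $.getJSON(url, function (data) {
            // one pass over the data to build the index
            $.each(data, function (index, item) {
                (byYear[item.year] = byYear[item.year] || []).push(item);
            });

            // a query is now a direct lookup instead of a full scan
            $.each(byYear[2012] || [], function (index, item) {
                $('#desc').append(item.make + ' ' + item.model + '<br/>' +
                                  item.price.net + '<br/>' + item.pic[0].full);
            });
        });

    For a file that keeps growing, the bigger win is usually to filter on the server (or split the JSON, e.g. per year) so the browser never downloads entries it does not need.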

    Read the article

  • rel="nofollow" SEO impact

    - by Torez
    I saw a technique used where there was a block with three parts: 1. Image (wrapped in an anchor tag) 2. Heading (anchor tag with heading text) 3. Paragraph (regular p tag with synopsis content), e.g.

        <li class="block">
            <a rel="nofollow" class="thumb" href="#"><img src="images/placeholder_service_thumbnail.jpg" alt="" /></a>
            <a class="h3" href="#">Good SEO Heading</a>
            <p>Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. Vestibulum tortor quam, feugiat vitae, ultricies eget, tempor sit amet, ante. Donec eu...</p>
        </li>

    The anchor tag wrapping the image has rel="nofollow" on it. So the idea is that the user still has the ability to click the image and go to the details page, but the image link does not count for ranking; the heading link is the only one that does for that specific page. Q: Is this the correct approach? Does this even do anything? What is the best practice?

    Read the article

  • Deny access to a folder on hosting server but serve the pages

    - by Sourav
    My hosting server allows me to host multiple websites. The directory structure is like this:

        root
        |_ www.a.com
        |_ www.b.com
        |_ www.c.com
        |_ www.d.com

    I want to put some PHP files in the www.d.com folder so that someone browsing the site from a web browser can get them, but no one can get their source code [not even by logging in to the root folder]. Is there any way of doing so? There is a feature called "Password protect folder" or something like that; can it help in this case?
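    The web server itself never sends PHP source to a browser; it executes the file and sends the output, so the "browse but don't read" part is already handled. Protecting the files from someone who can log in to the account is a matter of file permissions and the host's account isolation, not of web configuration. If the worry is about helper/include files being fetched directly over the web, a hedged Apache .htaccess sketch (the .inc.php naming convention is my assumption) would be:

        # Block direct web access to include files; pages that include them still work.
        <FilesMatch "\.inc\.php$">
            Order allow,deny
            Deny from all
        </FilesMatch>

    (The "Password protect folder" feature is HTTP authentication: it makes the browser ask for a password, so it restricts web visitors, not people with filesystem access.)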

    Read the article

  • Will Tracking Subdomains as Single Entity with Google Analytics Help SEO? [closed]

    - by Sam Gridley
    Possible Duplicate: Does Google Analytics data affect SEO? We have two subdomains, one for our blog and one for our ecommerce store. The blog serves to bring traffic and the store is how we monetize the site. We have them designed to appear as one large site, but I know google sees them as two sites. Here is how the subdomains look: www.example.com (store) blog.example.com (blog) I believe I can configure analytics to use subdomain tracking as explained here: http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=55524 But my question is whether this will cause google to see our 2 subdomains as one larger domain for SEO purposes. In other words, is there any relationship to how you configure google analytics and how google indexes and ranks your website(s) and pages? Is there anything I need to do in anaytics or webmaster tools to make google aware that these two subdomains work together as one website? Thanks! Sam
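    For reference, the subdomain setup in the linked help article boils down to giving both subdomains the same tracker and cookie domain; with the classic asynchronous ga.js snippet that looks roughly like this (the property ID is a placeholder, and the exact snippet may differ from the current docs):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);    // same property on both subdomains
        _gaq.push(['_setDomainName', 'example.com']);  // cookie shared across subdomains
        _gaq.push(['_trackPageview']);

    That only changes how Analytics reports the traffic, though; as the linked duplicate ("Does Google Analytics data affect SEO?") discusses, Analytics configuration is not documented as an input to how Google indexes or ranks a site.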

    Read the article

  • URL Rewrite http to https EXCEPT files in a specific subfolder

    - by BrettRobi
    I am trying to force all traffic on my web site to use HTTPS, using the URL Rewrite 2.0 module added to IIS 7.5. I got that working, and now have a need to exclude a couple of pages from using SSL. So I need a rule that rewrites all URLs to HTTPS except those referencing this folder. I've been banging my head against the wall on this and am hoping someone can help. I tried creating a rule to match all URLs except those in a nossl subfolder, as in this example:

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
          <match url="(/nossl/.*)" negate="true" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{HTTPS}" pattern="off" />
          </conditions>
          <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Found" />
        </rule>

    But this doesn't work. Can anyone help?
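    One likely culprit (a guess, since the rest of the configuration isn't shown): in IIS URL Rewrite the pattern in match url is tested against the URL path without a leading slash, so "(/nossl/.*)" never matches and the negation excludes nothing; on top of that, a negated match leaves no back-reference for {R:1}. The more common arrangement is to match everything and exclude the folder with a negated condition, roughly:

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{HTTPS}" pattern="off" />
            <!-- leave anything under /nossl/ on plain HTTP -->
            <add input="{URL}" pattern="^/nossl/" negate="true" />
          </conditions>
          <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Found" />
        </rule>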

    Read the article

  • Why is my domain redirect on Google Apps returning 404?

    - by Tom Brito
    I have a configuration in the Google Apps Control Panel (dcc.securepaynet.net) to redirect tombrito.com to http://buscatextual.cnpq.br/buscatextual/visualizacv.do?id=K4499244H9. It worked fine until some days ago, but now it's returning 404. If you access tombrito.com you can see the favicon in the title of the browser tab, but the page shows a 404 error. The target page http://buscatextual.cnpq.br/buscatextual/visualizacv.do?id=K4499244H9 is fine; it's only some problem with my redirect. Any idea what's wrong here?

    Read the article

  • iFrame ads from site to site

    - by user28327
    How can I make the iframe ads from site A work on site B, other than like this: <iframe src="http://www.site.com"></iframe>? Example. Site A:

        <script type="text/javascript"><!--
        google_ad_client = "ca-pub-3434343507";
        /* site */
        google_ad_slot = "343435270";
        google_ad_width = 728;
        google_ad_height = 90;
        //-->
        </script>
        <script type="text/javascript" src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
        </script>

    Site B:

        --iFrame--
        <script type="text/javascript"><!--
        google_ad_client = "ca-pub-3434343507";
        /* site */
        google_ad_slot = "343435270";
        google_ad_width = 728;
        google_ad_height = 90;
        //-->
        </script>
        <script type="text/javascript" src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
        </script>
        --iFrame--

    Read the article

  • How to properly forward a URL/domain

    - by NRGdallas
    No clue on a title for this; someone feel free to suggest an edit. I have a client that has a website. He owns around 200 domains and wants each domain to contain content from the main website. The header, footer, and navigation bars will remain the same for each domain, but the actual page content will vary (obviously there are duplicate content issues; open to suggestions). He wants each individual page to be its own separate domain rather than a URL within the main domain (page1.com, page2.com, etc., NOT site.com/page1.html, even though the file is actually hosted at site.com/page1.html; all links will point to site.com/whatever accordingly). What would be the best place to start reading/learning about how to do this, and what concerns/considerations should be kept in mind?
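    As a rough sketch of one way to do the mapping, assuming Apache with mod_rewrite, all 200 domains parked on the same document root as the main site, and made-up domain names:

        # .htaccess in the shared document root.
        # Each extra domain serves one page of the main site at its root URL;
        # the address bar keeps the extra domain, links go back to site.com.
        RewriteEngine On

        RewriteCond %{HTTP_HOST} ^(www\.)?page1\.com$ [NC]
        RewriteRule ^$ /page1.html [L]

        RewriteCond %{HTTP_HOST} ^(www\.)?page2\.com$ [NC]
        RewriteRule ^$ /page2.html [L]

    With 200 near-identical domains the duplicate-content concern is real; the usual mitigations are a rel="canonical" tag pointing at the main site's URL, or 301 redirects instead of serving the content under every domain, depending on what the client actually wants out of the extra domains.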

    Read the article

  • Want to tap into a niche market. Do I create a new site or bolt it onto my existing site?

    - by nitbuntu
    Hi, After a lot of hard work and a few years of perseverance, I'm seeing regular sales on my website, which have been steadily growing over the past year. However, the entrepreneur in me wants to tap into a niche market which I've become very interested in. It's possible to bolt this niche onto my existing site as an additional category without it looking too out of place; my new category of products would also benefit from the ranking my current site gets. The kind of people who would purchase these new niche products, however, are very particular and obsessive about detail. So, for example, many vegetarians would not eat at KFC even if it were to introduce a new range of veggie burgers. So I thought it best to create a new website, and since my existing site was created using an 'old-school' shopping cart and there are many more up-to-date, feature-rich ones available now, I wanted to use a different shopping cart system. My dilemma is that I already have 2 websites (1 B2C and another B2B site), and maintaining a 2nd B2C site would end up vastly increasing my workload; I fear that I would not be able to pay adequate attention to all the sites. Moreover, the additional customer service work (e.g. answering emails from many separate email accounts) could end up being too confusing and difficult to maintain. The easy answer would be to take on an employee, but I'm just not earning enough to justify this yet. If anyone has any tips or experience they'd like to share which could help me answer this question, I'd be highly grateful.

    Read the article

  • Tracking Redirects Leading to your site

    - by Bill
    Is there a way in which I can find out if a user arrived at my site via a redirect? Here's an example: there are two sites, first.com and second.com. Any request to first.com will do a 302 redirect to second.com. When the request arrives at second.com, is there any way to know it was redirected from first.com? Note that in this example you have no control over first.com. (In fact, it could be something bad, like kiddieporn.com.) Also note that, because it is a redirect, it will not be in the HTTP referrer header.

    Read the article

  • Synchronise databases between servers via PHP [closed]

    - by Emmanuel
    Hi guys, I need to synchronise two MySQL databases between different servers on a regular basis, via a client-initiated interface. I've been doing it over a remote MySQL connection, adding the IPs of the servers to the whitelist for remote MySQL connections. The problem, however, is that the client has a dynamic IP, so as soon as it changes they can no longer sync. So I'm trying to find an alternative way of synchronising the two databases via some sort of secure PHP script.

    Read the article

  • Update Google Sitemap for Mobile

    - by dimo414
    I have a series of utilities to generate Google sitemaps for my whole site. These files are massive, and slow to build. We want to start telling Google these pages are mobile-crawl-able too, by adding them to mobile sitemaps, but the documentation is unclear if I need to specify physically different files for my mobile URLs than for my normal ones. If this is my current sitemap: <?xml version="1.0" encoding="UTF-8" ?> <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> <url> <loc>http://mobile.example.com/article100.html</loc> </url> </urlset> Can I simply change it to: <?xml version="1.0" encoding="UTF-8" ?> <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0"> <url> <loc>http://mobile.example.com/article100.html</loc> <mobile:mobile/> </url> </urlset> Or do I need to create new files with the additional markup, alongside my existing files?

    Read the article

  • The (not provided) count has reached 80%! But what about the remaining 20%?

    - by Rajesh Magar
    I am up to date on all the (not provided) explanations, as Google has encrypted all of its searches, but here is the little question banging again and again in my head: if all searches are encrypted with the HTTPS protocol, then how is Google Analytics still able to track some (20%) of the organic keyword details? I mean, there are still some keywords appearing in my organic keywords section. So how does Google Analytics track, or bypass, that HTTPS thing? Thanks in advance!

    Read the article

  • Webhosting with custom database choice [closed]

    - by churchill614
    Possible Duplicate: How to find web hosting that meets my requirements? I am trying to find somewhere to host a website which uses OrientDB as its database. My budget doesn't stretch to a dedicated server where I can configure everything as I need it. Rather, I am hoping to find somewhere, ideally UK-based, that will allow me to install OrientDB (or will install it for me) on a server of the normal shared variety. Is anybody able to point me in a good direction for this, please (whilst UK-based is preferable, it is not essential)?

    Read the article

  • ideas for a personal website [on hold]

    - by user1314836
    I am planning to register a personal domain + hosting space to be less dependent on external companies. I would like to know if you could share some ideas of what I could do with my own domain. I have been thinking of some of them... Use my own e-mail (but Google Apps is no longer free...). Share my photos instead of using Dropbox. Receive big files or many files through anonymous FTP. Occasional backups? (I don't know if my host would let me use the hosting for personal storage.) Any other ideas or comments on the above?

    Read the article

  • Google Analytics - TOS section pertaining to privacy

    - by Eike Pierstorff
    The Google Analytics terms of service do not allow you to track "data that personally identifies an individual (such as a name, email address or billing information), or other data which can be reasonably linked to such information by Google". Does anybody have first-hand knowledge of whether this includes user IDs which cannot be resolved by Google but can be linked to actual persons via an Analytics user's CRM system (e.g. a CRM linked to Analytics via API access)? I used to think so, but if that were the case many ecommerce implementations would be illegal (since they store transaction IDs which can be linked to clients' purchases). If anybody has insights about the intended meaning of the paragraph (preferably with a reliable source), it would be great if he/she could share :-)

    Read the article

  • Why doesn't Firefox cache my images and CSS

    - by Richard A
    I am using IIS 7 and have already set up the following, but when I run Firefox it seems not to cache any of my images, even with "remember history" set.

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <staticContent>
              <clientCache cacheControlCustom="public" cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
            </staticContent>
          </system.webServer>
        </configuration>

    However, Firebug still suggests that Firefox is not caching images and CSS. Response headers:

        Cache-Control      public,max-age=604800
        Content-Type       text/css
        Content-Encoding   gzip
        Last-Modified      Mon, 27 Jun 2011 03:53:22 GMT
        Accept-Ranges      bytes
        Etag               "507968c27d34cc1:0"
        Vary               Accept-Encoding
        Server             Microsoft-IIS/7.5
        X-Powered-By       ASP.NET
        Date               Mon, 27 Jun 2011 13:06:41 GMT
        Content-Length     5067

    Request headers:

        Host               www.xx.com
        User-Agent         Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1
        Accept             text/css,*/*;q=0.1
        Accept-Language    en-us,en;q=0.5
        Accept-Encoding    gzip, deflate
        Accept-Charset     ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive         115
        Connection         keep-alive
        Referer            http://www.xx.com/
        Cookie             __utma=62996397.135679654.1309106351.1309159743.1309164158.8; __utmz=62996397.1309106351.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); __utmc=62996397

    Read the article

  • Has Microsoft stopped offering the free Internet Explorer Application Compatibility VPC Image for IE 6 testing?

    - by Paul D. Waite
    For some time now, Microsoft has made available free, stripped-down, time-limited Virtual PC images for testing web apps in older versions of IE. The most recent version is here: http://www.microsoft.com/download/en/details.aspx?id=11575 But the XP VPC image has now expired (14th Aug 2011), meaning one can no longer test IE 6 using this method. Have Microsoft made updated XP VPC images available? If not, have they commented on the situation? Do they provide any alternative method to test web apps in IE 6? Update As noted by @PleaseStand, as of 16th Aug 2011, Microsoft has made updated images available that expire on 17th November 2011.

    Read the article

  • Web Host for Small Rails-based CMS site [closed]

    - by clem
    Possible Duplicate: How to find web hosting that meets my requirements? I am building a site for someone that uses a Rails-based content management system that I built myself. All of the Rails deployment experience I have so far has been over small intranets. I'm looking at web hosts like rackspace, because it seems like they're well-suited for Rails deployment. However, for a site that's not going to have more than a couple of hundred hits a month (if even that), I'm not sure it's necessary. I've also used Dreamhost's Phusion Passenger deployment for small projects before, but it seems barely functional and not well-supported, and I've also used Heroku for deployment, but I think a regular web host may do a little bit better, as they'll need things like Google Apps for Gmail set up. If anyone could provide some guidance on this, I'd greatly appreciate it. I get confused when I see things on rackspace like "1.5c/hour", because I'm not sure how that gets computed.

    Read the article

  • What measures can be taken to increase Google indexing speed for a given newly created page?

    - by knorv
    Consider a website with a large number of pages. New pages are published regularly. When publishing a new page, the website operator wants to get the newly created page indexed in Google as soon as possible, minimizing the time spent between publication and indexing. Consider the site http://www.example.com/ with hundreds of thousands of pages. The page http://www.example.com/something/important-page.html is created at, say, 12:00. How do I get important-page.html indexed as soon as possible after 12:00? Ideally within seconds or minutes. Or, more generally: what options are available to try to get Google to index a specific newly created page as soon as possible?
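    Two concrete options that existed at the time of writing (how much weight Google gives them is another matter) are listing the new URL in a sitemap with a fresh lastmod and pinging Google when the sitemap changes; a sketch with an illustrative date:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/something/important-page.html</loc>
            <lastmod>2011-08-15T12:00:00+00:00</lastmod>
          </url>
        </urlset>

    The ping is a plain GET to http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml (the sitemap path is an example). Beyond that, the usual levers are submitting the URL through Google Webmaster Tools and linking the new page prominently from frequently crawled pages such as the home page and feeds.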

    Read the article
