Search Results

Search found 14053 results on 563 pages for 'upk pro (knowledge pathwa'.


  • Parse text file on click and display

    - by John R
    I am thinking of a methodology for rapid retrieval of code snippets. I imagine an HTML table that cross-references function names (a grid with rows and columns labelled one, two, ... ten and cells such as oneTwo(), oneTen(), twoOne(), tenTwo()). When a user clicks a function in this HTML table, a snippet of code is shown in another div tag or perhaps a popup window (I'm open to different solutions). I want to maintain only one PHP file named utilities.php that contains a class called 'util'. This file and class will hold all the functions referenced in the above table (it is also used on various projects and is functional code). A key idea is that I do not want to update the HTML documentation every time I write or update a function in utilities.php. I should be able to click a function in the table and have PHP open the utilities file, parse out the appropriate function and display it in an HTML window. Questions: 1) I will be coding this in PHP and JavaScript but am wondering if similar scripts are available (for all or part) so I don't reinvent the wheel. 2) Quick and easy Ajax suggestions are appreciated too (I will probably use jQuery, but am rusty). 3) What is a good methodology for parsing the functions out of utilities.php? (I'm not too good with regex.)

    Read the article
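
    For the parsing question above, here is a minimal sketch (assumptions: utilities.php sits next to the script, and the clicked cell passes the function name as fn; the file name snippet.php is made up) that uses PHP's tokenizer instead of a regex to pull a single function's source out of the file. On the client side, a one-line jQuery call such as $.get('snippet.php', {fn: 'oneTwo'}, ...) could load the output into a div.

        <?php
        // snippet.php?fn=oneTwo -- echoes the source of the named function from utilities.php
        $fn = isset($_GET['fn']) ? preg_replace('/[^A-Za-z0-9_]/', '', $_GET['fn']) : '';
        $tokens  = token_get_all(file_get_contents(__DIR__ . '/utilities.php'));
        $out     = '';
        $capture = false;
        $depth   = 0;
        foreach ($tokens as $i => $tok) {
            // Start capturing when we reach "function <fn>"
            if (!$capture && is_array($tok) && $tok[0] === T_FUNCTION) {
                for ($j = $i + 1; $j < count($tokens); $j++) {
                    if (is_array($tokens[$j]) && $tokens[$j][0] === T_STRING) {
                        $capture = ($tokens[$j][1] === $fn);
                        break;
                    }
                }
            }
            if ($capture) {
                $out .= is_array($tok) ? $tok[1] : $tok;
                $t = is_array($tok) ? $tok[0] : $tok;
                // Naive brace counting; good enough for a snippet viewer
                if ($t === '{' || $t === T_CURLY_OPEN || $t === T_DOLLAR_OPEN_CURLY_BRACES) {
                    $depth++;
                } elseif ($t === '}' && --$depth === 0) {
                    break;
                }
            }
        }
        header('Content-Type: text/plain; charset=utf-8');
        echo $out !== '' ? $out : 'Function not found.';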

  • Pros and cons of creating a print friendly page to remove the use of pdfs?

    - by Phil
    The company I work for has a one-page invoice that uses the TCPDF library. They wanted some design changes that I found incredibly difficult to set up in .pdf format. Using HTML/CSS I could easily create the page and have it print very nicely, but I have a feeling I am overlooking something. What are the pros and cons of setting up a page just for printing? What are the pros and cons of putting out a .pdf? I could also inline the CSS so that if they wanted to download the page and open it later, they could.

    Read the article
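
    As a rough illustration of the print-friendly option described above, a single block of print rules in the stylesheet is usually all it takes to turn the normal HTML invoice into the printed version (the class names below are placeholders, not anything from the question):

        /* In the main stylesheet: rules that apply only when the page is printed */
        @media print {
            nav, .sidebar, .no-print { display: none; }     /* hide site chrome */
            body     { font: 11pt/1.4 serif; color: #000; background: #fff; }
            .invoice { width: auto; margin: 0; box-shadow: none; border: none; }
        }

    Whether this beats the TCPDF route mostly comes down to how much control is needed over page breaks and margins, which browsers still handle less predictably than a generated PDF.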

  • How do I analyze vague Google Analytics data re traffic from Facebook?

    - by user6982
    We have one Facebook fan page and two personal profiles that could be sending traffic, and then there are the many Facebook pages of friends, etc. I am also running an ad campaign from my FB account for my husband's business, which has a link from his personal FB profile and his fan page. In Google Analytics for his business we get the following referring pages from Facebook: /ajax/emu/end.php (listed under facebook.com / referral), /l.php (which is a not-found page at FB), and /ajax/emu/end.php (listed under apps.facebook.com). Both of the working links send me to the home page of my profile, which is the account I am working from to create and review the FB ad campaign we are running. Is this data telling me anything useful at all? Is there a best practice for tracking and analyzing Facebook traffic that is a lot more granular? Thanks!

    Read the article
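
    One common way to get more granular Facebook data in Google Analytics (an assumption about what would help here, not something stated in the question) is to tag every link you control (the ad, the fan page posts, the profile links) with campaign parameters, so the visits show up in the campaign reports instead of as the opaque /ajax/emu/end.php referrers. The values below are placeholders:

        http://www.husbands-business.example/?utm_source=facebook&utm_medium=cpc&utm_campaign=fb_ad_spring
        http://www.husbands-business.example/?utm_source=facebook&utm_medium=referral&utm_campaign=fanpage_post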

  • Disadvantages of a fake phpMyAdmin honeypot that causes ip blacklisting and robots.txt disallow/exclusion of the honeypot?

    - by Tchalvak
    I'm trying to figure out whether I should set up a honeypot system with a fake phpMyAdmin (the site gets hits all the time from people spidering for insecurities in that app). My thought was to create a honeypot PHP script that would mimic a phpMyAdmin login and then blacklist IPs that hit that URL (and aren't already whitelisted). I would then add the appropriate URLs to robots.txt so that spiders that actually respect my robots.txt wouldn't be caught by the blacklist. Are there disadvantages to this approach? Do legitimate robots sometimes fail to respect robots.txt in certain circumstances? Are there any problems with this that I should consider in advance?

    Read the article
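
    A bare-bones sketch of the honeypot script described above (file names, paths and form field names are illustrative; a real deployment still needs something, such as a firewall script, that actually consumes the blacklist):

        <?php
        // Served at the fake /phpmyadmin/index.php, which is also disallowed in robots.txt
        $ip        = $_SERVER['REMOTE_ADDR'];
        $whitelist = is_file(__DIR__ . '/whitelist.txt')
            ? file(__DIR__ . '/whitelist.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
            : array();

        if (!in_array($ip, $whitelist, true)) {
            // Append the offender so a separate deny/firewall job can pick it up later
            file_put_contents(__DIR__ . '/blacklist.txt', $ip . ' ' . date('c') . "\n", FILE_APPEND | LOCK_EX);
        }

        // Show something that vaguely resembles a login page so the scanner moves on
        echo '<html><body><h3>phpMyAdmin</h3><form method="post">',
             'User: <input name="pma_username"> Pass: <input type="password" name="pma_password"> ',
             '<input type="submit" value="Go"></form></body></html>';

    The matching robots.txt entry would simply be "Disallow: /phpmyadmin/", so crawlers that honour it never reach the script in the first place.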

  • Robots meta tag with "noimageindex"

    - by jimy
    I have a doubt regarding the noimageindex value in the robots meta tag. If I add this tag to my page, say http://www.example.com/somepage/someaction.php, and the images on that page are served from another domain, say http://www.exampleimg.com, will the tag still have any effect? I mean, will the images be ignored by bots, or is exampleimg.com unaffected by the tag so that all the images will be indexed anyway? Note: we want to stop indexing of the images on that particular page.

    Read the article
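
    For reference, the tag in question is a one-liner in the page's head; as Google documents it, noimageindex asks that images embedded on that particular page not be indexed, regardless of which host serves the image files, although the same images can still be picked up from other pages that embed them without the tag:

        <!-- in the <head> of someaction.php -->
        <meta name="robots" content="noimageindex">

    If you also control exampleimg.com, sending an X-Robots-Tag: noindex response header on the image files themselves is the more direct way to keep them out of the image index entirely.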

  • SEO in relation to web hosting [closed]

    - by jimmy obonyo
    Possible Duplicate: Does changing web hosting server affect SEO page ranking? I have two websites. One of them, despite vigorous attempts to optimize it for certain Google keywords, or even for its own site name, still performs poorly, while the other site actually performs better and better. The two sites are hosted by different hosting companies: one by bytehost.net, the other by youhosting.com. So here is my question: does anyone know whether there is any relation between the hosting company and indexing, and if there is a relationship, how do you choose a good company to get better SEO indexing and rating?

    Read the article

  • What options are there for integrating with payment gateways? [closed]

    - by Rowland Shaw
    It seems there are only two types of payment gateway service out there at the moment: either the entire cart logic is handled offsite (with something like PayPal's Standard option), or you have to go through certification for handling credit card numbers and do pretty much everything yourself. Ideally, for the project I'm working on, I'm after a bit of middle ground: I handle the cart on-site and only pass over to a payment gateway (with an order amount, billing and delivery details, and an order reference) for them to handle the card details, before passing back. I'm sure I've used e-commerce sites built on this pattern before, but I cannot find any payment providers that offer this sort of option, so are there any? The only other requirement we have at present is that it must accept orders in Sterling.

    Read the article

  • Bookmark login_email at new PayPal URL [closed]

    - by Jonna Stevens
    I have used this bookmark in Firefox so that my email would be autofilled and I only had to type in my password. PayPal has recently changed its login URL. Has anybody figured out a method to achieve this with the new URL? Old URL: https://www.paypal.com/es/cgi-bin/webscr?cmd=_login-run&login_email=myemail%40myemail.com New URL (not working): https://www.paypal.com/es/webapps/mpp/home-merchant?login_email=myemail%40myemail.com

    Read the article

  • Multiple TOC with MediaWiki using section headings in single page

    - by user1704043
    I'm running my own installation of MediaWiki, which has been great! I haven't been able to find the answer to this small problem in any post, how-to, etc. Here's the setup: the article TOC is limited to showing only H1 and H2, and the headings are nested ==H1==, ===H2===, ====H3====, ====H3====. I don't want the H3s to show up in the main table of contents, because they would make the list very long. Instead, under each H2, I would like to display another TOC with all the H3s under that heading. From my understanding, you cannot have multiple tables of contents on a single page. I've thought about making a template for each H2 that holds the H3 links, but that seems to duplicate a lot of work and create loads of pages. I'd love a template that sucks in all the subsection names and spits them out, but I don't see how to do that. Alternatively, is there a way to enable multiple TOCs in a custom install of MediaWiki that I'm missing? Even that would get closer to what I'm trying to do.

    Read the article

  • How do I use IIS7 rewrite to redirect requests for (HTTP or HTTPS):// (www or no-www) .domainaliases.ext to HTTPS://maindomain.ext

    - by costax
    I have multiple domain names assigned to the same site and I want all possible access combinations redirected to one domain. In other words, whether the visitor uses http://domainalias.ext, http://www.domainalias.ext, https://www.domainalias3.ext, https://domainalias4.ext or any other combination, including http://maindomain.ext, http://www.maindomain.ext and https://www.maindomain.ext, they are all redirected to https://maindomain.ext. I currently use the following code to partially achieve my objectives:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="CanonicalHostNameRule" stopProcessing="true">
                  <match url="(.*)" />
                  <conditions>
                    <add input="{HTTP_HOST}" pattern="^MAINDOMAIN\.EXT$" negate="true" />
                  </conditions>
                  <action type="Redirect" redirectType="Permanent" url="https://MAINDOMAIN.EXT/{R:1}" />
                </rule>
                <rule name="HTTP2HTTPS" stopProcessing="true">
                  <match url="(.*)" />
                  <conditions>
                    <add input="{HTTPS}" pattern="off" ignoreCase="true" />
                  </conditions>
                  <action type="Redirect" redirectType="Permanent" url="https://MAINDOMAIN.EXT/{R:1}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    ...but it fails to work in all instances: it does not redirect to https://maindomain.ext when the user enters https://(www.)domainalias.ext. So my question is: is anyone here familiar enough with IIS7 rewrite to help me modify my existing code to cover all possibilities and reroute all my domain aliases, entered by themselves or with www in front, over HTTP or HTTPS, to my main domain over HTTPS? The logic would be: if the URL does NOT start with https://maindomain.ext, then REDIRECT to https://maindomain.ext/(plus_whatever_else_that_followed). Thank you very much for your attention; any help would be appreciated. NOTE TO MODS: If my question is not in the correct format, please edit or advise. Thanks in advance.

    Read the article
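
    One hedged way to express the "anything that is not already https://maindomain.ext" logic in a single rule is to group the host test and the HTTPS test with logicalGrouping="MatchAny", so that either a wrong host or plain HTTP triggers the redirect (sketch only, adapted from the question's own web.config):

        <rule name="CanonicalHttpsHost" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAny">
            <!-- fires when the host is anything other than the bare main domain... -->
            <add input="{HTTP_HOST}" pattern="^MAINDOMAIN\.EXT$" negate="true" />
            <!-- ...or when the request arrived over plain HTTP -->
            <add input="{HTTPS}" pattern="off" ignoreCase="true" />
          </conditions>
          <action type="Redirect" redirectType="Permanent" url="https://MAINDOMAIN.EXT/{R:1}" />
        </rule>

    Note that for https://www.domainalias.ext to be redirected at all, the alias host names also need to be covered by an HTTPS binding and certificate on the site; otherwise the browser fails before the rewrite rules ever run.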

  • Google is not treating two Australian schools as separate sites when both are subdomains of qld.edu.au

    - by LuckySpoon
    My question relates to two websites, each of which is a "Calvary Christian College", but in two totally different locations and entirely unrelated to each other (except by name, and thus domain). All schools in the state are issued a <school-name>.qld.edu.au subdomain, in this case calvary.qld.edu.au and calvarycc.qld.edu.au. What's interesting is that these domains are crossing into each other's sitelinks for searches such as calvary christian college townsville. The green data (in the results I'm looking at) is for one school (the Townsville school, as per the search term), and the red data is for the other school. I put in a sitelink demotion for this 6 months ago (we control calvary.qld.edu.au), but we're seeing no change on the results page. I have been able to get the owners of calvarycc.qld.edu.au to submit demotions for our domain, which should go in sometime in the next few days. What can we do to tell Google that these websites are not interchangeable, despite both appearing as "subdomains" of qld.edu.au? We could possibly open channels of communication with the administrators of qld.edu.au, but we would need to tell them what to change, and at this point I'm out of ideas.

    Read the article

  • Marketing for Scheduled Online Events

    - by JT703
    Last year I started working with a team on our first major web project (We, the Pixels). I believe the idea is very solid, but it has a hard requirement that a group of people be on the site for randomly scheduled events, and we are having problems getting people to come and stay for these events. What is the proper marketing approach to bring people to the site for them? We have recently done the following in an attempt to fix the problem: added email notification when new events are created; added privileges based on rank; added text throughout the site encouraging users to schedule events further in the future so other users have time to see that they exist; gotten involved with other communities that would find the site interesting in order to promote (market) it; and advertised using Google AdWords. Is there a standard marketing approach for a case like this?

    Read the article

  • Which is better for search engines, repeated phrases or different phrases with the same meaning?

    - by George Botros
    When designing an ads website I have two options: 1) let the advertiser choose from predefined lists to create the new ad, for example a product list (T-Shirt, Shorts, Suit, ...) and a color list (Black, Red, ...); or 2) let the advertiser write his own descriptive content for the product, for example "Amazing suit with a good price". I like the first scenario, but which is better for search engine optimization [SEO]: repeated phrases or different phrases with the same meaning? Note: assume each page will contain one or more ads.

    Read the article

  • When Google search gives incorrect results, how can it be reported?

    - by vgv8
    If Google search query results are incorrect: how can it be reported, and what is the procedure to correct it? @Lèse majesté: incorrect results are results that do not contain any of the searched keywords, as in this question of mine. @John Conde: yes, I believe that is the right definition. @DisgruntedGoat: even when there are a lot of results for keycaptcha within "Past 24 hours", Google.com presents results only for reCAPTCHA. In any case, they are different from the results Google returns if you search for "keycaptcha" (in quotes), and from those of other search engines. Does everybody think that searches for one keyword should be sneakily substituted with Google's own promoted brand products?

    Read the article

  • Configure open_basedir under Plesk

    - by cori
    This might be a question for ServerFault, and if it weren't for the Plesk aspect I would have asked it there to start with, so if it's better suited over there let me know and I'll move it. I'm working on a dedicated server set up as a reseller account with Plesk to manage the domains and server configuration, and I need to add a directory to the local open_basedir configuration for a specific vhost. Given Plesk's normal methodology, I expected to be able to go to /var/www/vhost/{%DOMAINNAME%}/conf, modify vhost.conf and place the new value there, as I have successfully done with other configuration settings for this domain (turning safe_mode off, for instance). When I do so, however, the new setting doesn't take effect (per phpinfo()). If I edit httpd.conf (which the Plesk configuration specifically says not to do in the notes at the top of httpd.conf), the setting takes effect. Is there something specific about the open_basedir setting that makes it not configurable in vhost.conf? How much trouble am I letting myself in for by editing the vhost-specific httpd.conf? (I imagine if someone makes changes in the Plesk web interface it might be overwritten, but what other risk is there?) Thanks!

    Read the article
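
    A sketch of the usual mod_php way to do this in Plesk's per-domain file, with placeholder paths taken from the question. Note that open_basedir is normally set with php_admin_value, and the value given replaces the existing list rather than adding to it, so it must include the docroot and temp directories as well as the new directory; also, after editing vhost.conf, Plesk has to regenerate/reload its web server configuration before the change shows up in phpinfo() (the exact reconfigure command, websrvmng or httpdmng, depends on the Plesk version).

        # /var/www/vhost/{%DOMAINNAME%}/conf/vhost.conf
        <Directory /var/www/vhost/{%DOMAINNAME%}/httpdocs>
            php_admin_value open_basedir "/var/www/vhost/{%DOMAINNAME%}/httpdocs:/tmp:/path/to/extra/dir"
        </Directory>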

  • Looking for full web based booking system

    - by Sandro Dzneladze
    I've searched around here but could not find any booking system that would suit my needs; I hope you can help me out. I've already seen question #11379 over here, but the solutions provided there don't help and the scripts aren't interesting. I'm looking for a booking system that supports an unlimited number of hotels/rooms/pricing schemes. There should be an unlimited number of hotel owners/users who can access a control panel, set availability for their hotel, and create and adjust seasonal prices. I should be able to take a 15% booking fee paid by credit card. I can work on integration with my system; it just needs the right API/functionality to support this. I love WordPress, so if there are some nice plugins for that I'm open to suggestions, but this is not a must. It should be PHP/MySQL based, as I'm not good at anything else :)

    Read the article

  • Buy internet country domain name: .fr .co.uk .de .com.au .sg etc

    - by user700580
    I already bought a .com domain name on GoDaddy for my company. I would like to reserve the same name with country-specific domain extensions, but I'm not sure where to buy them or how to do it. Here are the ones I would like to buy: Europe: fr, co.uk, de, ch, es, it, nl, se, no, ru; Australia: com.au; Asia: sg. GoDaddy has all except one in Europe, Australia and Singapore. Should I find a registrar that sells all of them, or should I buy some of them on GoDaddy and others elsewhere? Any suggestions on where to buy them? Until now I've only ever bought .com domain names, so I'm not sure how to do it. Thanks

    Read the article

  • Getting the user back where they came from with mod_form_auth

    - by bmargulies
    Using the mod_form_auth module in Apache HTTPD 2.4.3, I am looking for a way to have the user redirected to their original target after completing a login. That is, if I have a <Location /protected> ... form auth config here </Location>, the user might browse to /protected/a or to /protected/b. In either case they will be presented with the login form. However, as far as I can see, I must specify a single 'success' URL. Am I missing some Apache feature that would allow me, for example, to make the redirect to the login form go to something like https://login.html?origTarget=/protected/a via some syntax in the AuthFormLoginRequiredLocation directive?

    Read the article

  • New site not appearing in index after change of address, no feedback from Google Webmaster Tools

    - by Duffy
    Our change of address does not seem to be taking effect. Here's the story so far. We're a web company and our product is called The New Hive. Our site used to be at thenewhive.com, but we decided to switch to newhive.com (drop the "the", it's cleaner). The timeline of what I've tried, starting on July 29th: set up 301 redirects for all pages (e.g. thenewhive.com/tag/art = newhive.com/tag/art). At this point we noticed that we had disappeared from search results when searching for "The New Hive"; the front page used to be all links to our site plus a couple of news articles about the company. So on August 5th I verified the new domain in Webmaster Tools (the old domain was already verified) and submitted a change of address request via Webmaster Tools / Configuration / Change of Address. Then after another week, on August 13th, I went to Webmaster Tools / Health / Fetch as Google, fetched our homepage and a couple of subpages (all successfully), and clicked "Submit to Index" for the homepage. As of today (August 23rd) we're still not showing up in the index. We're getting no warnings or feedback of any kind from the dashboard, so I'm inclined to think something's broken with the dashboard rather than that something's wrong with our site from an SEO perspective. From the dashboard: no new messages or recent critical issues; Crawl Errors: no data available. From Health / Index Status: Total indexed: 0, Ever crawled: 42,490, Not selected: 12, Blocked by robots: 0. I'm really at a loss here; any help would be appreciated.

    Read the article

  • Synchronizing audio with scrolling text

    - by mr yoshida
    I am trying to build a website that vertically scrolls about 5 paragraphs of text along with a matching audio file that reads them aloud. It doesn't need to be synchronized word for word (such as highlighting each spoken word), but it does need accurate start and stop times. I've searched quite a bit for the most efficient way of doing this but can't seem to find any answers. I tried Flash but really don't want to use it. Thanks in advance.

    Read the article
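
    For what it's worth, a small JavaScript sketch of the non-Flash approach (the element ids are made up): listen to the audio element's timeupdate event and map playback progress onto the scroll position of the text container, which gives accurate start and end times without word-level syncing.

        // <audio id="narration" src="reading.mp3" controls></audio>
        // <div id="lyrics" style="height:300px; overflow:hidden">...five paragraphs...</div>
        var audio = document.getElementById('narration');
        var box   = document.getElementById('lyrics');

        audio.addEventListener('timeupdate', function () {
            if (!audio.duration) return;                        // metadata not loaded yet
            var progress  = audio.currentTime / audio.duration; // 0 .. 1
            var maxScroll = box.scrollHeight - box.clientHeight;
            box.scrollTop = progress * maxScroll;               // linear mapping: start/end line up
        });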

  • HTTP header 302 error

    - by Katherine Katie
    The response headers are:

        status: HTTP/1.1 302 Found
        connection: close
        pragma: no-cache
        cache-control: no-cache
        location: /
        location: /NKiXN/

    I don't know how it got this way. I did use the W3 Total Cache plugin, but I have deactivated it for now. Please help me solve this: the site is dropping in the search engine rankings and Googlebot is unable to follow it. Urgent help required; if this is a configuration problem with the server, please let me know the solution. Site: http://onlinecheapestcarinsurance.co.uk/

    Read the article

  • Layout Columns - Equal Height

    - by Kyle
    I remember first starting out using tables for layouts and learned that I should not be doing that. I am working on a new site and cannot seem to do equal-height columns without using tables. Here is an example of the attempt with div tags:

        <div class="row">
          <div class="column">column1</div>
          <div class="column">column2</div>
          <div class="column">column3</div>
          <div style="clear:both"></div>
        </div>

    What I tried was making the columns float left and setting their widths to 33%, which works fine; I use the clear:both div so that the row is as tall as the biggest column, but the columns themselves end up different heights based on how much content they have. I have found many fixes, which mostly involve CSS hacks that just make it look right, but that's not what I want. I thought of doing it in JavaScript, but then it would look different for those who choose to disable JavaScript. The only true way of doing it that I can think of is using tables, since cells in the same row all have equal heights, but I know it's bad to use tables. After searching forever I came across this: http://intangiblestyle.com/lab/equal-height-columns-with-css/ What it seems to do is exactly the same as tables, since it's just setting the display exactly like tables. Would using that be just as bad as using tables? I honestly can't find anything else I could do. Edit: @Su' I have looked into "faux columns" and do not think that is what I want. I think I would be able to implement better designs for my site using the display:table method. I posted this question because I just wasn't sure if I should use it, since I have always heard it's bad to use tables in website layouts.

    Read the article
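
    For reference, the technique on that lab page boils down to a few display rules (this is the standard CSS-table pattern, sketched against the question's own markup): the HTML stays plain divs and only the rendering borrows the table layout algorithm, so screen readers and crawlers never see a data table, which is a large part of why layout tables are frowned upon in the first place.

        .row {
            display: table;        /* the row behaves like a table */
            width: 100%;
            table-layout: fixed;   /* three equal-width columns */
        }
        .row .column {
            display: table-cell;   /* cells in the same row share the tallest height */
            width: 33.33%;
            vertical-align: top;
        }
        /* the <div style="clear:both"> spacer is no longer needed with this approach */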

  • Google PageRank not showing after redirecting www to non-www?

    - by muhammad usman
    I have a fashion website. I had redirected my http:// (non-www) domain to the http://www domain, and my preferred domain in Google Webmaster Tools was the www version. Now I have redirected http://www to the http:// (non-www) domain and changed my preferred domain as well. Now Google PageRank is not showing for even a single page. Would anybody please help me and let me know if I have done something wrong? Below is my .htaccess redirect code:

        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        RewriteCond %{HTTP_HOST} ^www\.deemasfashion\.com$
        RewriteRule ^deemasfashion\.com/?(.*)$ http://deemasfashion.com/$1 [R=301,L]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
        RewriteRule ^index\.html$ http://deemasfashion.com/ [R=301,L]
        RewriteRule ^index\.htm$ http://deemasfashion.com/ [R=301,L]

    Read the article
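
    For comparison, a hedged rework of that .htaccess (same domain as the question, sketch only): the www rule in the original can never fire because a RewriteRule pattern is matched against the path, never against the host name, so the host test has to live in the RewriteCond, and the canonical-host redirect should come before the WordPress front-controller rules.

        RewriteEngine On
        RewriteBase /

        # 1) Canonical host first: send any www. request to the bare domain
        RewriteCond %{HTTP_HOST} ^www\.deemasfashion\.com$ [NC]
        RewriteRule ^(.*)$ http://deemasfashion.com/$1 [R=301,L]

        # 2) Drop index.html / index.htm from explicit requests
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html?\ HTTP/
        RewriteRule ^index\.html?$ http://deemasfashion.com/ [R=301,L]

        # 3) Standard WordPress front controller, last
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]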

  • Average screen ratio

    - by sam
    I'm building a portfolio website that uses a full-screen background image slideshow, with the images cropped to fit by a JS plugin. To keep cropping to a minimum, what's the best aspect ratio to make the images? I.e. I know 13" MacBooks are around 13:7 (when taking into account about 100px for the browser bar), but does that scale up to 15", 17" and 24" displays? I know there are charts showing the most common screen dimensions, but they just show a range of sizes grouped into categories rather than actual dimensions.

    Read the article

  • Gzip compress offline?

    - by shoosh
    I've configured my site to serve compressed content by putting this line in .htaccess:

        AddOutputFilterByType DEFLATE text/html text/plain text/xml text/javascript text/css application/javascript application/json

    This works perfectly for almost all files except a few large JSON files that are above 200 KB; for some reason they are not being compressed. I can see that they aren't by using the Net tab in Firebug and the Network section in Chrome. So as a workaround I thought I could compress these files offline and have Apache serve them pre-compressed. What tool should I use to compress them? Is the Linux gzip command the one? Are there any special flags or options I should use? And what should I put in .htaccess so that the server knows to serve these files with Content-Encoding: gzip?

    Read the article
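
    A sketch of the offline workaround, assuming mod_rewrite and mod_headers are available: compress the big files once on the command line with something like gzip -9 -c big.json > big.json.gz, keep both copies next to each other, and let Apache hand out the .gz twin whenever the client accepts gzip.

        # .htaccess -- serve pre-compressed JSON when the browser accepts gzip
        RewriteEngine On
        RewriteCond %{HTTP:Accept-Encoding} gzip
        RewriteCond %{REQUEST_FILENAME}.gz -f
        RewriteRule ^(.+\.json)$ $1.gz [L]

        <FilesMatch "\.json\.gz$">
            # declare it as gzip-encoded JSON rather than a gzip download
            Header set Content-Encoding gzip
            Header append Vary Accept-Encoding
            ForceType application/json
        </FilesMatch>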
