Search Results

Search found 21350 results on 854 pages for 'url parsing'.


  • Kickstarting VMWare ESX 4.1 (Error: No NIC with name bootif)

    - by William
    I'm having an issue kickstarting an installation of VMware ESX Classic 4.1. I've stripped my kickstart down a bit, to just:

        accepteula
        keyboard us
        auth
        clearpart --firstdisk --overwritevmfs
        url --url=10.16.0.1/cblr/ks_mirror/esx-classic-4.1.0-260247
        rootpw --iscrypted $1$zZJa3g7g$mD8d.6QgbPku1QovQTAps/
        timezone 'US/Pacific'
        network --addvmportgroup=true --device=vmnic0 --bootproto=dhcp
        part '/boot' --fstype=ext3 --size=1100 --onfirstdisk
        part 'none' --fstype=vmkcore --size=110 --onfirstdisk
        part 'datastore1' --fstype=vmfs3 --size=8920 --grow --onfirstdisk
        virtualdisk 'esxconsole' --size=7920 --onvmfs='datastore1'
        part 'swap' --fstype=swap --size=916 --onvirtualdisk='esxconsole'
        part '/var/log' --fstype=ext3 --size=2000 --onvirtualdisk='esxconsole'
        part '/' --fstype=ext3 --size=5000 --grow --onvirtualdisk='esxconsole'
        %post --interpreter=bash

    However, when I attempt to use this kickstart during a PXE install with no additional kernel options, I get the following error:

        There was a problem with the Network Device specified on the command line.
        Error: No NIC found with name bootif

    If I comment out the network line in the kickstart, the error changes to:

        There was a problem with the Network Device specified on the command line.
        Error: No NIC found with name eth0

    How can I fix this? Thanks.
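    One thing worth checking (an assumption based on the cblr/ks_mirror path suggesting a Cobbler-managed PXE setup, not something stated above) is whether the pxelinux entry actually passes a BOOTIF value: pxelinux only appends BOOTIF=<mac> when IPAPPEND 2 is set, and without it a ksdevice=bootif style boot line leaves the installer looking for a literal NIC named "bootif". A hypothetical pxelinux.cfg entry sketch; the kernel, initrd and kickstart paths are placeholders, not taken from the question:

        DEFAULT esx41
        LABEL esx41
            KERNEL images/esx-4.1/vmlinuz
            # ks path is a placeholder; ksdevice=bootif relies on the IPAPPEND line below,
            # alternatively hard-code ksdevice=vmnic0
            APPEND initrd=images/esx-4.1/initrd.img ksdevice=bootif ks=http://10.16.0.1/path/to/esx41.ks
            IPAPPEND 2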

    Read the article

  • Setting alias for DynDNS domain

    - by metalball
    Hey all, I've created a DynDNS domain for testing my local sites, and I'm having trouble pointing to the root domain. At my registrar (GoDaddy) I've created a CNAME for www that points to example.dyndns.com, so going to the URL www.example.com reaches my site. But if I go to example.com, I reach the IP of the A record. I can't set the IP of the A record to my own IP because I have a dynamic IP, which changes constantly, and I can't point the A record to a domain, only to an IP. When I try to create a CNAME record for @ pointing to example.dyndns.com, I get the error "A record of a different type exists for the hostname @, could not create CNAME". The only records using the '@' host are NS records, which I can't delete, and when I tried to set another NS record with @ pointing to example.dyndns.com, I lost the connection to my site :) So what can I do to get the example.com URL to reach my site? Thanks!
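    For context, the error reflects a general DNS constraint rather than a GoDaddy quirk: a CNAME cannot coexist with any other record at the same name, and the zone apex (@) always carries NS (and SOA) records, so @ can never be a CNAME. A small hypothetical zone sketch illustrating the constraint (the name-server names and IP are placeholders):

        ; records at the apex (@): a CNAME is not allowed here
        @     IN  NS     ns1.example-dns.com.
        @     IN  NS     ns2.example-dns.com.
        @     IN  A      203.0.113.10          ; the apex must use an address record
        ; a CNAME is fine on any other label
        www   IN  CNAME  example.dyndns.com.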

    Read the article

  • SEO - Google and link cleaning / cloaking [closed]

    - by Jens Törnell
    Possible Duplicate: Does the Google spider render JavaScript?

    This is an SEO-related question, not a code-related one.

    Google's own link cleaning / cloaking: Go to http://www.google.com and search for something. Hover the title and you will see a link to the page you want to go to. The URL you see when hovering is NOT the link you are clicking on. Instead of clicking, you can drag the title a little bit and then hover it. Then you will see the real URL.

    My own link cleaning / cloaking: Go to http://jsfiddle.net/NvmER/1/ and click the link, or look at the code below. You will be "redirected" to http://www.test.com. The real link is http://www.test.com/?event=23.

    Working code in case jsfiddle doesn't work. If you need to see how it works, I pasted the code below.

        <a class="direct" href="http://www.test.com/?event=23" data-redirect="http://www.test.com">Länk</a>

        $(document).ready(function() {
            $("a.direct").live("mousedown", function(e){
                var oldurl = $(this).attr('href');
                var newurl = $(this).attr('data-redirect');
                $(this).attr('href', newurl);
            });
        });

    Question: Is this OK with Google? It's done with JavaScript. If you have an answer, link to a source or test to support it.

    Read the article

  • Multi MVC processing vs Single MVC process

    - by lordg
    I've worked fairly extensively with the MVC framework CakePHP; however, I'm finding that I would rather have my pages driven by multiple MVCs than by just one MVC. My reason is primarily to maintain a more DRY approach. In CakePHP's MVC: you call a URL, which calls a single MVC, which then calls the layout. What I want is: you call a URL, it processes a layout, which then calls multiple MVCs, one per component/block of HTML on the page. When you compare JavaScript components, AJAX, and server-side HTML rendering, it seems the most consistent method for building pages is through blocks of components or HTML views. That way, a view block could be situated either on the server or the client. This is technically my ONLY disagreement with the MVC model. Outside of this, IMHO MVC rocks! My question is: What other RAD frameworks follow the same principles as MVC but are driven by the View side of MVC instead? I've looked at Django and Ruby on Rails, yet they seem to be more Controller-driven. Lift/Scala appears to be somewhat of a good fit, but I'm interested to see what others exist.

    Read the article

  • Operation MVC

    - by Ken Lovely, MCSE, MCDBA, MCTS
    It was time to create a new site. I figured VS 2010 is out, so I should write it using MVC and Entity Framework. I have been very happy with MVC. My boss has had me building an administration web site in MVC2, but using 2008. I think one of the greatest features of MVC is that you get to work with the root of the app. It is kind of like being an iron worker; you get to work with the metal and mold it from scratch. Getting my articles out of my database and onto web pages was by far easier with MVC than it was with regular ASP.NET. This code is what I use to post the article to that page. It's pretty straightforward. The link in the menu passes the id, which is simply the URL of the page. It looks for that URL in the database and returns the rest of the article.

        DataResults dr = new DataResults();
        string title = string.Empty;
        string article = string.Empty;
        foreach (var D in dr.ReturnArticle(ViewData["PageName"].ToString()))
        {
            title = D.Title;
            article = D.Article;
        }

        public List<CurrentArticle> ReturnArticle(string id)
        {
            var resultlist = new List<CurrentArticle>();
            DBDataContext context = new DBDataContext();
            var results = from D in context.MyContents
                          where D.MVCURL.Contains(id)
                          select D;
            foreach (var result in results)
            {
                CurrentArticle ca = new CurrentArticle();
                ca.Title = result.Title;
                ca.Article = result.Article;
                ca.Summary = result.Summary;
                resultlist.Add(ca);
            }
            return resultlist;
        }

    Read the article

  • 500 Internal Server Error with PHP application

    - by James
    I have written a PHP application using Windows and XAMPP. I've been trying to run it on Ubuntu 10.10 with Lighttpd 1.4.26. Parts of the application work fine, but whenever I try to log in, I get a 500 - Internal Server Error page. The only thing that shows up in /var/log/lighttpd/error.log is 2011-02-25 13:43:13: (mod_fastcgi.c.2582) unexpected end-of-file (perhaps the fastcgi process died): pid: 1169 socket: unix:/tmp/php.socket-0 2011-02-25 13:43:13: (mod_fastcgi.c.3367) response not received, request sent: 1596 on socket: unix:/tmp/php.socket-0 for /~denton/customer-facing-portal/index.php?, closing connection If I had any output whatsoever from PHP, this would be a lot easier to debug. Any ideas on how to get some? Here is my /etc/lighttpd/lighttpd.conf file: # Debian lighttpd configuration file # ############ Options you really have to take care of #################### ## modules to load server.modules = ( "mod_alias", "mod_compress", # "mod_rewrite", # "mod_redirect", # "mod_usertrack", # "mod_expire", # "mod_flv_streaming", # "mod_evasive", "mod_setenv" ) ## a static document-root, for virtual-hosting take look at the ## server.virtual-* options server.document-root = "/var/www/" ## where to upload files to, purged daily. server.upload-dirs = ( "/var/cache/lighttpd/uploads" ) ## where to send error-messages to server.errorlog = "/var/log/lighttpd/error.log" ## files to check for if .../ is requested index-file.names = ( "index.php", "index.html", "index.htm", "default.htm", "index.lighttpd.html" ) ## Use the "Content-Type" extended attribute to obtain mime type if possible # mimetype.use-xattr = "enable" ## # which extensions should not be handle via static-file transfer # # .php, .pl, .fcgi are most often handled by mod_fastcgi or mod_cgi static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" ) ######### Options that are good to be but not neccesary to be changed ####### ## Use ipv6 only if available. 
(disabled for while, check #560837) #include_shell "/usr/share/lighttpd/use-ipv6.pl" ## bind to port (default: 80) # server.port = 81 ## bind to localhost only (default: all interfaces) ## server.bind = "localhost" ## error-handler for status 404 #server.error-handler-404 = "/error-handler.html" #server.error-handler-404 = "/error-handler.php" ## to help the rc.scripts server.pid-file = "/var/run/lighttpd.pid" ## ## Format: <errorfile-prefix><status>.html ## -> ..../status-404.html for 'File not found' #server.errorfile-prefix = "/var/www/" ## virtual directory listings dir-listing.encoding = "utf-8" server.dir-listing = "enable" ### only root can use these options # # chroot() to directory (default: no chroot() ) #server.chroot = "/" ## change uid to <uid> (default: don't change) server.username = "www-data" ## change gid to <gid> (default: don't change) server.groupname = "www-data" #### compress module compress.cache-dir = "/var/cache/lighttpd/compress/" compress.filetype = ("text/plain", "text/html", "application/x-javascript", "text/css") #### url handling modules (rewrite, redirect, access) # url.rewrite = ( "^/$" => "/server-status" ) # url.redirect = ( "^/wishlist/(.+)" => "http://www.123.org/$1" ) #### expire module # expire.url = ( "/buggy/" => "access 2 hours", "/asdhas/" => "access plus 1 seconds 2 minutes") #### external configuration files ## mimetype mapping include_shell "/usr/share/lighttpd/create-mime.assign.pl" ## load enabled configuration files, ## read /etc/lighttpd/conf-available/README first include_shell "/usr/share/lighttpd/include-conf-enabled.pl" ## Set environment variables setenv.add-environment = ( "DB_URL__DEMO" => "192.168.1.231", "DB_NAME_DEMO" => "demo", "DB_USER_DEMO" => "user", "DB_PASS_DEMO" => "password", "DB_AGENCY_DEMO" => "demo" ) Here is my /etc/php5/cgi/php.ini file (sans 1641 lines of comments): [PHP] register_long_arrays = Off short_open_tag = Off engine = On short_open_tag = Off asp_tags = Off precision = 14 y2k_compliance = On output_buffering = 4096 zlib.output_compression = Off implicit_flush = Off unserialize_callback_func = serialize_precision = 100 allow_call_time_pass_reference = Off safe_mode = Off safe_mode_gid = Off safe_mode_include_dir = safe_mode_exec_dir = safe_mode_allowed_env_vars = PHP_ safe_mode_protected_env_vars = LD_LIBRARY_PATH disable_functions = disable_classes = expose_php = On max_execution_time = 30 max_input_time = 60 memory_limit = 128M error_reporting = E_ALL & ~E_DEPRECATED & ~E_STRICT display_errors = On display_startup_errors = On log_errors = On log_errors_max_len = 1024 ignore_repeated_errors = Off ignore_repeated_source = Off report_memleaks = On track_errors = On html_errors = On variables_order = "GPCS" request_order = "GP" register_globals = Off register_long_arrays = Off register_argc_argv = Off auto_globals_jit = On post_max_size = 8M magic_quotes_gpc = Off magic_quotes_runtime = Off magic_quotes_sybase = Off auto_prepend_file = auto_append_file = default_mimetype = "text/html" doc_root = user_dir = enable_dl = Off cgi.fix_pathinfo=1 file_uploads = On upload_max_filesize = 2M max_file_uploads = 20 allow_url_fopen = On allow_url_include = Off default_socket_timeout = 60 [Date] date.timezone = "America/Chicago" [filter] [iconv] [intl] [sqlite] [sqlite3] [Pcre] [Pdo] [Pdo_mysql] pdo_mysql.cache_size = 2000 pdo_mysql.default_socket= [Phar] [Syslog] define_syslog_variables = Off [mail function] SMTP = localhost smtp_port = 25 mail.add_x_header = On [SQL] sql.safe_mode = Off [ODBC] odbc.allow_persistent = On 
odbc.check_persistent = On odbc.max_persistent = -1 odbc.max_links = -1 odbc.defaultlrl = 4096 odbc.defaultbinmode = 1 [Interbase] ibase.allow_persistent = 1 ibase.max_persistent = -1 ibase.max_links = -1 ibase.timestampformat = "%Y-%m-%d %H:%M:%S" ibase.dateformat = "%Y-%m-%d" ibase.timeformat = "%H:%M:%S" [MySQL] mysql.allow_local_infile = On mysql.allow_persistent = On mysql.cache_size = 2000 mysql.max_persistent = -1 mysql.max_links = -1 mysql.default_port = mysql.default_socket = mysql.default_host = mysql.default_user = mysql.default_password = mysql.connect_timeout = 60 mysql.trace_mode = Off [MySQLi] mysqli.max_persistent = -1 mysqli.allow_persistent = On mysqli.max_links = -1 mysqli.cache_size = 2000 mysqli.default_port = 3306 mysqli.default_socket = mysqli.default_host = mysqli.default_user = mysqli.default_pw = mysqli.reconnect = Off [mysqlnd] mysqlnd.collect_statistics = On mysqlnd.collect_memory_statistics = Off [OCI8] [PostgresSQL] pgsql.allow_persistent = On pgsql.auto_reset_persistent = Off pgsql.max_persistent = -1 pgsql.max_links = -1 pgsql.ignore_notice = 0 pgsql.log_notice = 0 [Sybase-CT] sybct.allow_persistent = On sybct.max_persistent = -1 sybct.max_links = -1 sybct.min_server_severity = 10 sybct.min_client_severity = 10 [bcmath] bcmath.scale = 0 [browscap] [Session] session.save_handler = files session.use_cookies = 1 session.use_only_cookies = 1 session.name = PHPSESSID session.auto_start = 0 session.cookie_lifetime = 0 session.cookie_path = / session.cookie_domain = session.cookie_httponly = session.serialize_handler = php session.gc_probability = 1 session.gc_divisor = 1000 session.gc_maxlifetime = 1440 session.bug_compat_42 = Off session.bug_compat_warn = Off session.referer_check = session.entropy_length = 0 session.cache_limiter = nocache session.cache_expire = 180 session.use_trans_sid = 0 session.hash_function = 0 session.hash_bits_per_character = 5 url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=fakeentry" [MSSQL] mssql.allow_persistent = On mssql.max_persistent = -1 mssql.max_links = -1 mssql.min_error_severity = 10 mssql.min_message_severity = 10 mssql.compatability_mode = Off mssql.secure_connection = Off [Assertion] [COM] [mbstring] [gd] [exif] [Tidy] tidy.clean_output = Off [soap] soap.wsdl_cache_enabled=1 soap.wsdl_cache_dir="/tmp" soap.wsdl_cache_ttl=86400 soap.wsdl_cache_limit = 5 [sysvshm] [ldap] ldap.max_links = -1 [mcrypt] [dba] Update: here is /etc/lighttpd/conf-enabled/15-fastcgi-php.conf As far as I know, it's just the default config file the Ubuntu package installed. ## FastCGI programs have the same functionality as CGI programs, ## but are considerably faster through lower interpreter startup ## time and socketed communication ## ## Documentation: /usr/share/doc/lighttpd-doc/fastcgi.txt.gz ## http://redmine.lighttpd.net/projects/lighttpd/wiki/Docs:ConfigurationOptions#mod_fastcgi-fastcgi ## Start an FastCGI server for php (needs the php5-cgi package) fastcgi.server += ( ".php" => (( "bin-path" => "/usr/bin/php-cgi", "socket" => "/tmp/php.socket", "max-procs" => 1, "idle-timeout" => 20, "bin-environment" => ( "PHP_FCGI_CHILDREN" => "4", "PHP_FCGI_MAX_REQUESTS" => "10000" ), "bin-copy-environment" => ( "PATH", "SHELL", "USER" ), "broken-scriptfilename" => "enable" )) )
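    Regarding getting some output from PHP: in the php.ini above, log_errors is on but error_log is not set, so errors go to stderr, which may never surface in lighttpd's log when the FastCGI child dies. A minimal sketch that points PHP at its own log file instead; the path is an assumption, use any location the www-data user can write to:

        ; /etc/php5/cgi/php.ini
        log_errors = On
        error_log = /var/log/php/error.log

        # shell: create the location and make it writable by the lighttpd user
        mkdir -p /var/log/php
        chown www-data:www-data /var/log/php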

    Read the article

  • How to efficiently permanently redirect 150.000 images?

    - by Fabio Spampinato
    For SEO purposes I need to rename around 150,000 images, and then I'd like to permanently redirect requests for the previous URL locations to the new locations. The current URL of every image is something like:

        website.com/something/unique_id/filename.jpg

    And I want to redirect them to:

        website.com/something/unique_id/new_filename.jpg

    I can only think of 2 options: 1) Create an enormous list of redirects to include in my nginx conf file. 2) Redirect those requests to something like "website.com/new_location/unique_id" that will redirect the request again to the new path. Are there other, better options? Should I avoid multiple 301 redirects? Will crawlers downgrade my rankings because of multiple redirects?
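    For the case where the renames do not follow a regex-friendly pattern, one approach that keeps 150,000 rewrite lines out of the main config is an nginx map fed from a generated include file. This is only a sketch under that assumption, with illustrative paths:

        # inside the http {} block
        map $uri $new_image_uri {
            default "";
            # generated file, one entry per line, e.g.:
            # /something/123/old_name.jpg /something/123/new_name.jpg;
            include /etc/nginx/image-redirects.map;
        }

        # inside the server {} block
        if ($new_image_uri != "") {
            return 301 $new_image_uri;
        }

    If instead the new filename can be computed from the old one (a fixed prefix or suffix, say), a single regex rewrite returning a 301 would avoid the lookup table entirely.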

    Read the article

  • Value of the HTML5 lang attribute

    - by user359650
    I'm working on a website which will offer localized content following the language+region approach as described on this W3.org page (e.g. fr-CA for Canadian French content, and fr-FR for "French French" content). As we consider content for each language+region to be unique, it is crucial to us that search engines properly identify and serve the content accordingly. By looking around on the Internet (e.g. this question), it appears that most people recommend the use of an ISO 639 language code in the HTML lang attribute to describe the content language. Following this recommendation, we would end up using <html lang="fr">, which wouldn't enable differentiation between the aforementioned language+region combinations. When reviewing the HTML4 specification, it seems that using language+region as a language code would be perfectly OK, as the en-US example is given as one possible value. However, I couldn't find any confirmation of this in the HTML5 specification, which doesn't seem to provide any examples of the allowed values. From there I tried to get a de facto answer by looking at what the web giants are doing. I looked at what Facebook is doing: they offer Canadian French and French French versions of their website with (slightly) different content, whilst the HTML lang value remains the same:

        fr-CA
            URL: http://fr-ca.facebook.com
            HTML lang attribute: <html lang="fr">
            translation of the word 'email': courriel

        fr-FR
            URL: http://fr-fr.facebook.com/
            HTML lang attribute: <html lang="fr">
            translation of the word 'email': Adresse électronique

    Q: What is the recommended/standard way of describing content that was localized using the language+region approach in HTML5?
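    For reference, the HTML5 lang attribute takes a BCP 47 language tag, and language-region tags such as fr-CA are valid BCP 47, so they remain usable just as in HTML4. A small sketch of what a per-region page could look like; the hreflang alternate links and the example hostnames are an illustration of one way to expose regional variants, not something from the question:

        <!DOCTYPE html>
        <html lang="fr-CA">
        <head>
          <meta charset="utf-8">
          <title>Page en français canadien</title>
          <!-- hypothetical alternate links relating the regional variants -->
          <link rel="alternate" hreflang="fr-CA" href="http://fr-ca.example.com/">
          <link rel="alternate" hreflang="fr-FR" href="http://fr-fr.example.com/">
        </head>
        <body>Courriel : ...</body>
        </html>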

    Read the article

  • How can I enable http auth in lighttpd for all directories except one?

    - by Nuri Hodges
    I am trying to require authentication for everything in the webroot (/) except anything that resides in a particular directory (/directory/), and I've tried both of these options to no avail:

        $HTTP["url"] =~ "^(?!(/directory))" {
            auth.require = ( "" => ( "method" => "basic", "realm" => "auth to this area", "require" => "user=username" ) )
        }

        $HTTP["url"] != "/directory" {
            auth.require = ( "" => ( "method" => "basic", "realm" => "auth to this area", "require" => "user=username" ) )
        }
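    A variant worth trying (an assumption on my part, not a confirmed fix) is lighttpd's negative regex match operator !~, which sidesteps the look-ahead entirely; a sketch assuming lighttpd 1.4 conditional syntax and that the auth backend is already configured elsewhere:

        $HTTP["url"] !~ "^/directory(/|$)" {
            auth.require = ( "" => ( "method" => "basic", "realm" => "auth to this area", "require" => "user=username" ) )
        }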

    Read the article

  • International Dvorak keyboard doesn't trigger hot-keys

    - by akurtser
    Hi, I just configured a Hebrew-Dvorak keyboard with Keyboard Layout Manager. What this means is that when I hold Shift + any key, the input is the capital English letter. It works just fine; however, hot-keys that involve the Ctrl key aren't triggered when the input language is set to Hebrew (e.g. Ctrl+L in Firefox, which should set the focus on the URL bar). If, however, I hit the matching QWERTY key (input language = Hebrew), I get the desired result (i.e. the URL bar gets focused). The only thing I can think of is completely removing the US-QWERTY layout, but I don't know how this can be achieved, and besides, it may not solve the problem. Thanks, Almog.

    Read the article

  • How can I parse Amazon S3 log files?

    - by artlung
    What are the best options for parsing Amazon S3 (Simple Storage Service) log files? I've turned on logging, and now I have log files that look like this:

        858e709ba90996df37d6f5152650086acb6db14a67d9aaae7a0f3620fdefb88f files.example.com [08/Jul/2010:10:31:42 +0000] 68.114.21.105 65a011a29cdf8ec533ec3d1ccaae921c 13880FBC9839395C REST.GET.OBJECT example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg "GET /example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg HTTP/1.1" 200 - 32957 32957 12 10 "http://atlanta.craigslist.org/forums/?act=Q&ID=163218891" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19" -

    What are the best options for automating the parsing of these log files? I'm not using any Amazon services other than S3.
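    As an illustration of the format, the sketch below splits an S3 access-log line on whitespace while keeping quoted and bracketed fields intact, using only the Python standard library. The field names follow the order visible in the sample line and should be treated as an assumption about the documented layout:

        import re
        import sys

        # Each whitespace-separated field is either a [bracketed] timestamp,
        # a "quoted" string, or a bare token.
        TOKEN = re.compile(r'\[[^\]]*\]|"[^"]*"|\S+')

        # Assumed field order (matches the sample line above); treat as illustrative.
        FIELDS = [
            "bucket_owner", "bucket", "time", "remote_ip", "requester", "request_id",
            "operation", "key", "request_uri", "http_status", "error_code",
            "bytes_sent", "object_size", "total_time_ms", "turnaround_time_ms",
            "referrer", "user_agent", "version_id",
        ]

        def parse_line(line):
            """Return one log line as a dict keyed by (assumed) field name."""
            return dict(zip(FIELDS, TOKEN.findall(line)))

        if __name__ == "__main__":
            # usage: python parse_s3_log.py < access.log
            for line in sys.stdin:
                record = parse_line(line)
                print(record.get("remote_ip"), record.get("key"), record.get("http_status"))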

    Read the article

  • Announcing Monthly Silverlight Giveaways on MichaelCrump.Net

    - by mbcrump
    I've been working with different companies to give away sets of Silverlight controls to my blog readers for the past few months. I've decided to set up this page, with a friendly URL, so everyone can keep track of my monthly giveaway. The URL to bookmark is: http://giveaways.michaelcrump.net. My main goals for giving away the controls are:

    I love the Silverlight community, and giving away nice controls always sparks interest in Silverlight.
    To spread the word about a company and let the user decide if it meets their particular situation.
    I am a "control" junkie; it helps me solve my own business problems since I know what is out there.
    Provide some additional hits for my blog.

    Below is a grid that I will update monthly with the current giveaway. Feel free to bookmark this post and visit it monthly.

        October 2010  - Telerik      - Silverlight Controls (RadControls)                    - http://michaelcrump.net/archive/2010/10/15/win-telerik-radcontrols-for-silverlight-799-value.aspx
        November 2010 - Mindscape    - Mindscape Mega-Pack                                   - http://michaelcrump.net/archive/2010/11/11/mindscape-silverlight-controls--free-mega-pack-contest.aspx
        December 2010 - Infragistics - Silverlight Controls + Silverlight Data Visualization - http://michaelcrump.net/mbcrump/archive/2010/12/15/win-a-set-of-infragistics-silverlight-controls-with-data-visualization.aspx
        January 2011  - Cellbi       - Cellbi Silverlight Controls                           - http://michaelcrump.net/mbcrump/archive/2011/01/05/cellbi-silverlight-controls-giveaway-5-license-to-give-away.aspx
        February 2011 - *BOOKED*

    To Third-Party Silverlight Companies: If you create any type of Silverlight control/application and would like to feature it on my blog, you may contact me at michael[at]michaelcrump[dot]net. Giving away controls has proven to be beneficial for both parties, as I have around 4k Twitter followers and average around 1000 page views a day. Contact me today and give back to the Silverlight community. Subscribe to my feed

    Read the article

  • New META TAGS with positive effects for seo ranking in 2011 and beyond

    - by Sam
    Hi all, I'm trying to put together an up-to-date chart of meta tags, for all of us, with their purposes, their use, and their good (or bad) effects on being found by search engines. Does anybody know of new or promising meta tags? I will add yours to the list, so this chart stays the result of live discussion and up to date. Also, it would be creative to invent your own useful meta tag, because we are the ones making the web, aren't we?

        LEGEND
        P  PURPOSE?    What does this meta tag do in 2011, if anything
        N  NECESSARY?  Does every site really need it or not?
        G  GOOD        Whether it will have a good effect on your site being found
        I  INVENTED    Invented meta tag; who knows, it may be accepted in a year!

        META "METANAME" = PURPOSE? - NECESSARY? - GOOD EFFECT?

        #### important
        meta "title"       = P concise summary + teaser - N very - G extremely
        meta "description" = P description + teaser - N yes - G very
        meta "robots"      = P if needed, to skip default dmoz/yahoo dir listing - N no - G?

        #### new & promising! Thanks for input (John, )
        meta "original-source"    = P url of whoever broke the news gets credits - N? - G?
        meta "syndication-source" = P url for syndication of published news - N? - G?
        meta "canonical"          = P? - N? - G?

        #### seems obsolete
        meta "keywords"   = P some keywords - N+G not for Google, but Yahoo likes them
        meta "language"   = P overrule guesswork by defining language - N no - G?
        meta "page-topic" = P topic/theme - N? - G?
        meta "abstract"   = P short summary - N? - G?
        meta "copyright"  = ?

        #### invented by me
        meta "audience" = P filtered audience: "+seniors, +parents, -children, -youth"
        meta "mood"     = P specifies textual style: "discussion, informative, commercial, sexual, fictional, scientific, romantic, therapeutic, technical"

    Read the article

  • Check_webinject plugin will not connect to https site using

    - by uSlackr
    We're using Nagios to monitor some of our web sites. We have a script that uses an older plugin, which we are trying to replace with webinject.pl from CPAN. When the script runs, it generates this error:

        LWP::Protocol::https::Socket: SSL connect attempt failed with unknown error
        error:1407741A:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert decode error
        at /usr/local/share/perl5/LWP/Protocol/http.pm line 51.

    It appears the web site does not support TLSv1 for https. If it matters, the site is a Cisco WebVPN. I've pointed the same script at a different site that does support TLSv1 and it seems to work fine. My web search is coming up empty.

    Successful connect:

        <case id="1" description1="Metro Home Page" description2="Metro, login test" method="get"
              url="https://metro.myco.com/index.php" verifypositive="restricted"
              logrequest="yes" logresponse="yes" sleep="1" /

    Failing connect:

        <case id="2" description1="WebVPN Home Page" description2="webvpn.myco.com login test" method="get"
              url="https://webvpn.myco.com/webvpn.html" verifypositive="Authorized"
              logrequest="yes" logresponse="yes" sleep="1" /

    Read the article

  • path problem with mod_rewrite, XDebug, PDT, XAMPP and Windows XP

    - by Delirium tremens
    My mod_rewrite setup turns accounts/create into index.php?folder=accounts&action=create, but PDT ignores it, so when I try to start a PHP Script debug session, I have to type a folder location in the file field, and PDT doesn't accept it. When PDT auto-generates the URL for the PHP Web Page debug session, I go to http://localhost/myframe/index.php?XDEBUG%5FSESSION%5FSTART=ECLIPSE%5FDBGP&KEY=12569067976875, but myframe is in the frameworks folder, so I get a 404 error. When I check a breakpoint, uncheck Auto Generate, add frameworks before myframe in the URL, set Start Debug from http://localhost/frameworks/myframe/accounts/create in Advanced, and click Debug, the debugger doesn't stop at the breakpoint.
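    For reference, a rewrite of the kind described (accounts/create becoming index.php?folder=accounts&action=create) would typically look like the .htaccess sketch below; this is an illustration of the mapping under assumed paths, not the poster's actual rule:

        RewriteEngine On
        RewriteBase /frameworks/myframe/
        # leave real files and directories alone so index.php, css, images etc. still resolve
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # accounts/create -> index.php?folder=accounts&action=create
        RewriteRule ^([^/]+)/([^/]+)/?$ index.php?folder=$1&action=$2 [QSA,L]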

    Read the article

  • Using .htaccess to protect direct access of files

    - by claydough
    We need to prevent direct access to files on our site by someone just entering a URL in their browser. I got this to work by using an .htaccess file, and it is fine in IE & Safari, but for some reason Firefox doesn't cooperate. I think it has something to do with the way Firefox reports referrers. Here is my code in the .htaccess file:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{HTTP_REFERER} !^http://(my\.)?bigtimbermedia\.com/.*$ [NC]
        RewriteRule \.(swf|gif|png|jpg|doc|xls|pdf|html|htm|xlsx|docx)$ http://my.bigtimbermedia.com/ [R,L]

    If you want to see an example of this, try accessing this first: http://my.bigtimbermedia.com/books/bpGreyWolvesflip/index.html. It blocks it properly in all browsers. Now if you go to this URL and click on the link, it works in IE and Safari, but Firefox chokes and seems to end up in a loop. Any ideas how I can get this to work in Firefox? Thanks!

    Read the article

  • Unit testing ASP.NET Web API controllers that rely on the UrlHelper

    - by cibrax
    UrlHelper is the class you can use in ASP.NET Web API to automatically infer links from the routing table without hardcoding anything. For example, the following code uses the helper to infer the location URL for a new resource:

        public HttpResponseMessage Post(User model)
        {
            var response = Request.CreateResponse(HttpStatusCode.Created, model);
            // "DefaultApi" is the route configured below; the Id property name is assumed
            var link = Url.Link("DefaultApi", new { id = model.Id, controller = "Users" });
            response.Headers.Location = new Uri(link);
            return response;
        }

    That code uses a previously defined route, "DefaultApi", which you might configure in the HttpConfiguration object (this is the route generated by default when you create a new Web API project). The problem with UrlHelper is that it requires some initialization code before you can invoke it from a unit test (for testing the Post method in this example). If you don't initialize the HttpConfiguration and Request instances associated with the controller from the unit test, it will fail miserably. After digging into the ASP.NET Web API source code a little bit, I could figure out what the requirements for using the UrlHelper are. It relies on the routing table configuration and a few properties you need to add to the HttpRequestMessage. The following code illustrates what's needed:

        var controller = new UserController();
        controller.Configuration = new HttpConfiguration();

        var route = controller.Configuration.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional }
        );

        var routeData = new HttpRouteData(route,
            new HttpRouteValueDictionary {
                { "id", "1" },
                { "controller", "Users" }
            }
        );

        controller.Request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:9091/");
        controller.Request.Properties.Add(HttpPropertyKeys.HttpConfigurationKey, controller.Configuration);
        controller.Request.Properties.Add(HttpPropertyKeys.HttpRouteDataKey, routeData);

    The HttpRouteData instance should be initialized with the route values you will use in the controller method ("id" and "controller" in this example). Once you have correctly set up all those properties, you shouldn't have any problem using the UrlHelper. There is no need to mock anything else. Enjoy!!

    Read the article

  • IIS7, different ports for websites but no portnumber in the browser

    - by Queensheep
    I have a Windows Server 2008 machine running IIS7 with 4 websites. In DNS I have 4 different URLs which point to the IP of the server. I configured each web site with these site bindings: website1: hostname: url1, port: 80, IP address: the address of the server; website2: hostname: url2, port: 80, IP address: the address of the server; and so on. The result is that from the client I can browse with all 4 URLs to the corresponding web sites and everything is fine. Then I changed the ports of the websites in IIS, so that website1 now uses port 8080, website2 uses port 8081, and so on. Now I have to use the browser with the URL and the port number (like URL:8080). Is there a way to configure the websites with different port numbers but not use the port numbers in the browser?
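    For reference, the usual way to keep several sites on one server reachable without port numbers is the setup described in the first paragraph: leave them all on port 80 and distinguish them by host header. A sketch using appcmd, with placeholder site and host names (the equivalent can be done in IIS Manager under Site Bindings by filling in the Host name field):

        %windir%\system32\inetsrv\appcmd set site /site.name:"website1" /bindings:http/*:80:url1.example.com
        %windir%\system32\inetsrv\appcmd set site /site.name:"website2" /bindings:http/*:80:url2.example.com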

    Read the article

  • Need help with Microsoft Access 07 & Reports

    - by Moe
    I'm finding it difficult to get MS Access reporting working the way I'd like. What I'm trying to do is: a) In my database I store a URL (an HTTP link to an external file) that is a .jpeg. I'd like to use that URL to show the image on the report sheet. I have tried to use 'Control Source' on the Data panel, but with no success. Is there any way I can get dynamic images to show up for each record? Also, I have a couple of relational tables. One defines values, for example: DefinePets('petID', 'Name of Pet'). The other one links the main table with the DefinePets table, e.g.: connect('petID', 'mainID', 'extraField'). I'd like my report to go into the "connect" table where the currently viewed record's value = mainID, then find petID and return the name of the pet. There is a many-to-many link between DefinePets and the main table (therefore connect joins them up). Or is that too much to ask from a simple package like Access?

    Read the article

  • TypeError: Cannot call method 'hasOwnProperty' of null, while creating a QMLscene window

    - by tomoqv
    I am trying to make a simple Ubuntu Touch web application with Qt Creator. I have set up a new project according to the tutorial and committed the files to Bazaar. I have set a URL instead of the default index.htm in the project's .qml file. Build & Run loads a QML Scene window with the desired web page, but Qt Creator yields the following output:

        Starting /usr/lib/i386-linux-gnu/qt5/bin/qmlscene -I /home/tomas/ubuntu-sdk/SL-planner -I /usr/bin -I /usr/lib/i386-linux-gnu/qt5/qml /home/tomas/ubuntu-sdk/SL-planner/SL-planner.qml
        unity::action::ActionManager::ActionManager(QObject*): Could not determine application identifier. HUD will not work properly.
        Provide your application identifier in $APP_ID environment variable.
        file:///usr/lib/i386-linux-gnu/qt5/qml/Ubuntu/Components/MainView.qml:257: TypeError: Cannot call method 'hasOwnProperty' of null

    My SL-planner.qml looks like this:

        import QtQuick 2.0
        import Ubuntu.Components 0.1
        import QtWebKit 3.0

        /*!
            \brief MainView with a Flickable WebView.
        */

        MainView {
            // objectName for functional testing purposes (autopilot-qt5)
            objectName: "mainView"

            // Note! applicationName needs to match the "name" field of the click manifest
            applicationName: "com.ubuntu.developer.tomoqv.SL-planner"

            /*
              This property enables the application to change orientation
              when the device is rotated. The default is false.
            */
            automaticOrientation: true

            width: units.gu(100)
            height: units.gu(75)

            Flickable {
                id: webViewFlickable
                anchors.fill: parent

                WebView {
                    id: webView
                    anchors.fill: parent
                    url: "http://mobil.sl.se"
                }
            }
        }

    What am I missing?

    Read the article

  • how do I write a functional specification quickly and efficiently

    - by giddy
    So I just read some fabulous articles by Joel on specs here. (Written in 2000!) I read all 4 parts, but I'm looking for a methodical approach to writing my specs. I'm the only (lonely) dev working on this fairly complicated app (or family of apps) for a very well-known finance company. I've never made something this serious. I started out writing something like a bad spec, an overview of sorts, and it has wasted a LOT of my time. I've also made 3 mockup-kind-of-thingies for my client, so I have a good understanding of what they want. I also released a preview (a throwaway working app with the most basic workflow), and I've only written and tested some of the very core/base systems. I think the mistake I've been making so far is not writing a detailed spec, so I'm getting to it now. The whole thing comprises:

    An MVC website (for admins & data viewing)
    2 Silverlight modules (for 2 specific tasks)
    1 desktop application

    I'm totally short on time and resources and need to get this done quickly; I also need to make sure these guys can read it equally quickly and painlessly. So how do I go about it? I'm looking for any tips, any real-world stuff. How do you guys usually do it? Do you make a mock screenshot of every dialog/form/page? I'm thinking of making a dummy ASP.NET Web Forms project, then filling in HTML files in folders to make it look like my MVC URL structure, and then having a section in the spec for the website with a page for every URL I've got, including a screenshot. For my WinForms app, I've made somewhat of a demo WinForms project; would I then put in a dialog or structure everything as I would in the real app and then screenshot it?

    Read the article

  • SetEnvIf regex for setting Content-Disposition HTTP header

    - by Erik Sorensen
    I am attempting to use the IHS 7.0 / Apache 2.2 SetEnvIf directive to set the filename of a downloaded file based on a URL parameter. I think I am pretty close; however, if there is a space (encoded or otherwise) in the filename, it fails.

    example url: http://site.com/path/to/filename.ext/file-title=Nice File Name.ext?file-type=foo

    apache config:

        SetEnvIf Request_URI "^.*file-title\=(.*)\??.*$" FILENAME=$1
        Header unset "Content-Disposition"
        Header add "Content-Disposition" "attachment; filename=%{FILENAME}e"
        UnsetEnv FILENAME

    An application will specify what now shows up as "Nice File Title.ext" in the example. This all works great if there are no spaces; however, if there is a space, the filename of the download just shows up as "Nice". There may or may not be a second set of parameters in the query string (?file-type, etc.)
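    One detail worth checking is that a Content-Disposition filename containing spaces has to be quoted, otherwise clients stop at the first space. A sketch of the same directives with a quoted filename parameter and the capture trimmed to stop before any query string; this is an assumption about the cause, not a verified fix for IHS 7.0, and percent-encoded spaces may still need decoding separately:

        SetEnvIf Request_URI "file-title=([^?]+)" FILENAME=$1
        Header unset "Content-Disposition"
        Header add "Content-Disposition" "attachment; filename=\"%{FILENAME}e\""
        UnsetEnv FILENAME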

    Read the article

  • Make Chrome's Omnibar behave more like the Firefox AwesomeBar

    - by Agnel Kurian
    One of my favorite features of the Firefox AwesomeBar is that I can simply type a substring of any URL or page title in my history, and it finds all matches sorted by how frequently they were accessed. Example: I simply type "ask" when I want to ask something on stackoverflow.com, "inbox" goes to my GMail inbox, and so on, because the substring matches any part of the URL or the page title. Chrome's Omnibar is quite frustrating in this area. I am not able to predict what it's going to fetch, and I seem to have no way to train the thing to do my bidding. I have unchecked the option that says "Use a suggestion service to help complete searches and URLs typed...", but there has been no noticeable improvement. Any clues as to how I can make the Omnibar behave?

    Read the article

  • Breadcrumbs RDFA

    - by Saahil Sinha
    I have implemented breadcrumb RDFa on http://www.mycarhelpline.com/index.php?option=com_forms&view=pages&layout=sellcar&Itemid=4. When checking the page, the RDFa data shows property: title: Home https://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.mycarhelpline.com%2Findex.php%3Foption%3Dcom_forms%26view%3Dpages%26layout%3Dsellcar%26Itemid%3D4 However, when I compare ours with others, e.g. http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Froyalenfield.com%2Fmotorcycles%2Fthunderbird-500%2F, the title and description of the current page are shown in their RDFa data, which is not the case in ours. Could someone suggest how to get the page title and description to show up in the RDFa data? Below is our breadcrumb code:

        <p><span class="breadcrumbs pathway">
          <span typeof="v:Breadcrumb">
            <a href="" rel="v:url" property="v:title">Home</a> &raquo;
            <span rel="v:child">
              <span typeof="v:Breadcrumb">
                <a href="index.php?option=com_forms&view=pages&layout=selloldcarindelhi&Itemid=4" rel="v:url" property="v:title">Sell Car</a> &raquo;
                <span rel="v:child">
                  <span typeof="v:Breadcrumb">
                    <a property="v:title">Sell Used Car</a>
                  </span>
                </span>
              </span>
            </span>
          </span>
        </span>

    Read the article

  • Odd squid transparent redirect behavior

    - by EMiller
    This is the first time I've set up Squid. It's running a redirect script that does some text search/replace on HTML pages, saves them to a location on the same machine on the nginx path, and then issues a redirect to that URL (it's an art project :D). The relevant lines in squid.conf are:

        http_port 3128 transparent
        redirect_program /etc/squid/jefferson_redirect.py

    The jefferson_redirect.py script is based on this script: http://gofedora.com/how-to-write-custom-redirector-rewritor-plugin-squid-python/

    The issue: I'm getting strange HTTP redirect behavior. For example, here is the normal request/response from a PHP script that issues a header("Location:"), i.e. a 302 redirect, for http://redirector.mysite.com/?unicmd=g+yreka :

        GET /?unicmd=g+yreka HTTP/1.1
        Host: redirector.mysite.com
        User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.9) Gecko/20100330 Fedora/3.5.9-1.fc12 Firefox/3.5.9
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 300
        Connection: keep-alive

        HTTP/1.1 302 Found
        Date: Tue, 13 Apr 2010 05:15:43 GMT
        Server: Apache
        X-Powered-By: PHP/5.2.11
        Location: http://www.google.com/search?q=yreka
        Content-Type: text/html
        Vary: User-Agent,Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 2108
        Keep-Alive: timeout=3, max=100
        Connection: Keep-Alive

    Here's what it looks like when running through the Squid proxy (note that "redirector.mysite.com" is not the site running Squid or nginx), again for http://redirector.mysite.com/?unicmd=g+yreka :

        GET /?unicmd=g+yreka HTTP/1.1
        Host: redirector.mysite.com
        User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.9) Gecko/20100330 Fedora/3.5.9-1.fc12 Firefox/3.5.9
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 300
        Proxy-Connection: keep-alive
        If-Modified-Since: Tue, 13 Apr 2010 05:21:02 GMT

        HTTP/1.0 200 OK
        Server: nginx/0.7.62
        Date: Tue, 13 Apr 2010 05:21:10 GMT
        Content-Type: text/html
        Content-Length: 17865
        Last-Modified: Tue, 13 Apr 2010 05:21:10 GMT
        Accept-Ranges: bytes
        X-Cache: MISS from jefferson
        X-Cache-Lookup: HIT from jefferson:3128
        Via: 1.1 jefferson:3128 (squid/2.7.STABLE6)
        Connection: keep-alive
        Proxy-Connection: keep-alive

    It is basically working, but the URL http://redirector.mysite.com/?unicmd=g+yreka remains unchanged while displaying the Google page (mostly broken, as it's using URLs relative to redirector.mysite.com). I've experienced a similar thing with Google results pages: when clicking through to another page from Google, I get a Google URL with the other site's content. Sorry for the long post; many thanks if you've read this far! Any ideas?

    Read the article
