Search Results

Search found 17054 results on 683 pages for 'jms request reply'.

  • Linux/Unix MTA with the smartest queue?

    - by threecheeseopera
    I am looking for an MTA that will allow me (a script, really) to proactively manage its send queue in response to status codes returned by the remote servers I am delivering to. Basically, for each mail sent I would like to be able to react to the SMTP reply code returned by the remote server, e.g. '250 OK', or to any error conditions like connection timeouts. Additionally, I would like to be able to manage the send queue going forward based on this information, e.g. 'example.com has timed out the last 5 connection attempts, so no longer queue mail for recipients @example.com'. I am currently using postfix and perl to parse its logs for this information, but I am playing a game of catch-up that is prone to errors (out-of-order log entries etc.) and it's starting to get messy (some real ugly regexes ;). I really don't want to reinvent the wheel and use some language's SMTP library; I would prefer to use a proven/fast/reliable MTA. I am, however, open to suggestions if what I need just isn't possible. Thanks for your help!
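    Purely as an illustration of the queue policy described above (and not of any particular MTA's API), here is a minimal Python sketch of the decision logic: record the SMTP reply code or timeout for each delivery attempt, and stop queueing mail for a domain after too many consecutive timeouts. The threshold, the record_attempt/should_queue names and the data structures are all hypothetical.

```python
from collections import defaultdict

TIMEOUT_LIMIT = 5  # hypothetical policy: pause a domain after 5 consecutive timeouts

consecutive_timeouts = defaultdict(int)
paused_domains = set()

def record_attempt(domain, smtp_code):
    """Update per-domain state from one delivery attempt.

    smtp_code is the numeric SMTP reply (e.g. 250), or None for a
    connection timeout / no response at all.
    """
    if smtp_code is None:
        consecutive_timeouts[domain] += 1
        if consecutive_timeouts[domain] >= TIMEOUT_LIMIT:
            paused_domains.add(domain)
    else:
        consecutive_timeouts[domain] = 0      # any reply resets the counter
        if 200 <= smtp_code < 300:
            paused_domains.discard(domain)    # deliveries are working again

def should_queue(recipient):
    """Return False if mail for this recipient's domain should not be queued."""
    domain = recipient.rsplit("@", 1)[-1].lower()
    return domain not in paused_domains

# Example: five timeouts in a row pause example.com
for _ in range(5):
    record_attempt("example.com", None)
print(should_queue("bob@example.com"))    # False
print(should_queue("alice@example.org"))  # True
```

    Whether this logic lives in a log follower, a policy daemon, or inside the MTA itself is exactly the open question of the post.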

    Read the article

  • CopSSH SFTP -- limit users access to their home directory only

    - by bradvido
    Let me preface this by saying I've read and followed the instructions in the FAQ many times: http://www.itefix.no/i2/node/37 It does not do what the title claims: it allows every user access to every other user's home directory, as well as access to all subfolders below the CopSSH installation path. I'm only using this for SFTP access, and I need my users to be sandboxed into their home directories only. If you know a foolproof way to lock users down so they can see only their home directory and its subfolders, stop reading now and reply with the solution.

    The details: here is exactly what I tried as I followed the FAQ. My CopSSH installation directory is C:\Program Files\CopSSH.

    net localgroup sftp_users /ADD  **Create a user group to hold all my SFTP users
    cacls c:\ /c /e /t /d sftp_users  **For that group, deny access at the top level and all levels below
    cacls "C:\Program Files\CopSSH" /c /e /t /r sftp_users  **Allow my user group access to the CopSSH installation directory and its subdirectories

    For each SFTP user, I create a new Windows user account, then:

    net localgroup sftp_users sftp_user_1 /add  **Add my user to the group I've created

    Then I open the Activate User wizard for CopSSH, choosing the user and "/bin/sftponly", and leave these options checked: "Remove copssh home directory if it exists", "Create keys for public key authentication", and "Create link to user's real home directory".

    This works; however, every user has access to every other user's home directory as well as the CopSSH root directory. So I tried denying all users access to the home directory:

    cacls "C:\Program Files\CopSSH\home" /c /e /t /d sftp_users  **Deny access for users to the user home directory

    Then I tried adding permissions on a user-by-user basis for each user's home\username folder. However, these permissions were not allowed by Windows, because the deny rule I created at the home directory was being inherited and overriding my allow rule. The next step for me would be to remove the deny rule at the home directory and, for each user folder, add a deny rule for every user it doesn't belong to and an allow rule for the one user it does belong to. However, as my user list gets long, this will become very cumbersome. Thanks for the help!
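    As an aside, and only as a sketch of how the cumbersome per-user step at the end could be automated: the short Python script below prints one grant command for each folder's owner and one deny command for every other user, following the same cacls pattern already used above. The home path and the user names are placeholders.

```python
# Hypothetical helper: emit cacls commands for per-user home folder permissions.
# HOME_ROOT and the user list are placeholders; adjust both before use.
HOME_ROOT = r"C:\Program Files\CopSSH\home"
users = ["sftp_user_1", "sftp_user_2", "sftp_user_3"]

for owner in users:
    folder = f'{HOME_ROOT}\\{owner}'
    # Grant the owning user full control of their own folder and its subfolders.
    print(f'cacls "{folder}" /c /e /t /g {owner}:F')
    # Explicitly deny every other SFTP user on this folder.
    for other in users:
        if other != owner:
            print(f'cacls "{folder}" /c /e /t /d {other}')
```

    The script only generates commands for review; it does not change any ACLs itself.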

    Read the article

  • How to fix subfolders IIS7 functionality?

    - by Amr ElGarhy
    I have a problem on my shared hosting: for all websites in subfolders, the URL appears like this: http://amrelgarhy.com/amrelgarhy/ I wrote to GoDaddy, and they told me it is because of IIS7 and they can't solve it. Can anyone tell me how to fix that? Here is what I sent to GoDaddy and their reply: "As I saw on this page http://www.godaddy.com/gdshop/hosting/shared.asp?ci=9009 comparing Windows plans, "Multiple Web sites: unlimited", so I have the right to run more than one website inside my hosting. But what I am facing now is that I can't make more than one website a primary website. I have igurr.com as a primary website, and I want to make the others primary too, because the home pages for all the other websites (which physically live in subfolders) look like "http://amrelgarhy.com/amrelgarhy/", i.e. the URL plus the folder name, and that is what I don't want." GODADDY: "Thank you for contacting Hosting Support. The behavior you are describing is standard for IIS 7.0 accounts. All alias domains in this environment will append the folder name they are located in, i.e. an alias domain www.coolexample.com pointed to the '/example' directory will display in a browser as "www.coolexample.com/example". This is due to the way IIS 7.0 handles virtual directories. Unfortunately we do not have any direct workaround for this. We apologize for any inconvenience this may cause."

    Read the article

  • ipv6 : why ndp resolves to global scope address?

    - by Julien
    I'm facing a strange IPv6 behavior and I don't know how to solve it because I'm not familiar with IPv6. Maybe this behavior is normal. I hope that you will help me. (I'm running Debian 6.0.9 with a custom 3.2.58 kernel.) Machine A, which is "2a00:7d30:edf6:100::1", wants to ping machine B, which is "2a00:7d30:edf6:100::10". Both are on the same segment. Machine A asks for the address of machine B, and I don't understand why machine B answers with its global scope address instead of its link-local one:

    10:59:02.082785 IP6 2a00:7d30:edf6:100::1 > ff02::1:ff00:10: ICMP6, neighbor solicitation, who has 2a00:7d30:edf6:100::10, length 32
    10:59:02.082821 IP6 2a00:7d30:edf6:100::10 > 2a00:7d30:edf6:100::1: ICMP6, neighbor advertisement, tgt is 2a00:7d30:edf6:100::10, length 32

    After that, machine A pings the global scope address of machine B and it works fine:

    10:59:02.082927 IP6 2a00:7d30:edf6:100::1 > 2a00:7d30:edf6:100::10: ICMP6, echo request, seq 1, length 64
    10:59:02.082960 IP6 2a00:7d30:edf6:100::10 > 2a00:7d30:edf6:100::1: ICMP6, echo reply, seq 1, length 64

    Thank you for your help. Best regards, Julien

    Read the article

  • Implementing a "state-machine" logic for methods required by an object in C++

    - by user827992
    What I have: one hypothetical object/class, plus other classes and related methods that give me functionality. What I want: to link this object to 0 to N methods at runtime, on request, when an event is triggered. Each event is related to a single method or to a class, so a single event does not necessarily mean "connect this one method only"; it can also mean "connect all the methods from that class, or a group of methods". I want to avoid linked lists, because I would have to walk the entire list to know which methods are linked, because a list does not guarantee that the linked methods are kept in a particular order (say, alphabetical order by name or by class), and also because it involves a massive amount of pointer usage. Example: I have an object Employee Jon. Jon acquires knowledge and forgets things pretty easily, so his skills may vary over time, and I'm responsible for what Jon can add to or remove from his CV. How can I implement this logic?
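    The question is about C++, where one natural shape for this is a std::map from event to an ordered container of std::function callbacks; purely to keep the illustration short, here is a sketch of that idea in Python. Every name in it (Employee, add_skill, forget_skill, EVENTS, trigger) is invented for the example.

```python
# A minimal sketch of an event -> handlers registry, keyed by handler name
# so lookup is direct and iteration order can be made alphabetical on demand.
class Employee:
    def __init__(self, name):
        self.name = name
        self.skills = set()

def add_skill(employee, skill):
    employee.skills.add(skill)

def forget_skill(employee, skill):
    employee.skills.discard(skill)

# Each event maps to a dict of named handlers; an event can therefore connect
# zero, one, or many methods, and handlers can be added or removed at runtime.
EVENTS = {
    "training_completed": {"add_skill": add_skill},
    "long_inactivity": {"forget_skill": forget_skill},
}

def trigger(event, employee, *args):
    for handler_name in sorted(EVENTS.get(event, {})):
        EVENTS[event][handler_name](employee, *args)

jon = Employee("Jon")
trigger("training_completed", jon, "C++")   # Jon learns C++
trigger("long_inactivity", jon, "C++")      # ...and forgets it again
print(jon.skills)                           # set()
```

    The same structure in C++ would look roughly like a std::map<Event, std::map<std::string, std::function<void(Employee&)>>>, which avoids walking a linked list and keeps handlers in a defined (sorted) order.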

    Read the article

  • Google De-Index many pages at once?

    - by Jakobud
    On one of our websites, Google has been indexing something it wasn't supposed to. We fixed the problem so it shouldn't happen anymore, but are interested in requesting that Google de-index these pages. The problem is that there are about 10,000 pages. They all look similar to this: http://www.mysite.com/file.php?o=34995 http://www.mysite.com/file.php?o=4566 http://www.mysite.com/file.php?o=223af http://www.mysite.com/file.php?o=6ga3h http://www.mysite.com/file.php?o=sfh45a etc... All the pages are file.php with get parameters like above. Is it possible to put in a de-index request like: http://www.mysite.com/file.php* so that Google removes all 10,000 pages?

    Read the article

  • iRedMail home setup - use different SMTP relay for different destination domains

    - by John
    Hello helpful server folks, I'm messing with iRedMail. I've mostly been successful, I think I have an SMTP problem. I have changed RoundCube (webmail) to use BrightHouse's, my ISP's, SMTP server for outgoing. It works fine, I click send and poof, I have gmail. I can reply from gmail to my email server, and it works. It took 10 hours for the email to show up, which is a different problem, I think, but it does work. But when I send from my server TO my own server, my ISP's Postmaster account sends me a cryptic blurb. I just got off the phone with them, and they say it "should work", and that they can't reach my pop3 server. (pop3, pop3s, imap, and imaps are all open on my router and forwarded to the server, I'm not sure what I need, I'm just covering my bases...) pop3 and/or imap as external interfaces are just formalities, I really just want webmail to work. Roundcube only takes one SMTP server in its configs. How can I configure Postfix to relay / forward emails to my ISP's SMTP, while taking messages bound for my own domain and processing them? Since my ISP won't let me "bounce" my emails off of it. Maybe I'm vastly misunderstanding how e-mail works in general: To receive mail, I should only need port 25, SMTP, open to the internet, correct? Should I be concerned about some authentication failure from the outside to my relay? (My relay requires user/pass to use, my ISP's requires none.)

    Read the article

  • Transitioning to asynchronous programming model

    - by Simone
    Our team is maintaining and developing a .NET web service written in C#. We have stress-tested the web service farm and we have evidence that the current architecture doesn't scale well as the number of requests keeps increasing. We analyzed Martin Fowler's conclusion in this article, and our team feels that migrating to an asynchronous programming model such as the one described could be the right direction for our service too. My question is: do you think this "switch" requires a complete rewrite of the application? Has anyone of you been able to adopt APM without rewriting everything, and do you have some insight to share? Thank you in advance.

    Read the article

  • Chrome Rewrite of Host: in HTTP GET

    - by user912679
    At some point in the past I had a plugin for Firefox that rewrote the HTTP headers being sent by the browser, specifically the "Host:" line in the HTTP GET request. I can't find this plugin online. Does anyone know a plugin or another way to do this? I am looking for one for Chrome, but any would work. The specific reason is that I am trying to work on a WordPress website that I just did a DNS change on. Until that DNS change takes effect I can use the IP, but since it's a shared host the Host line isn't set right.
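    For what it's worth, the same effect can be tested outside the browser: a common interim fix is a temporary hosts-file entry pointing the domain at the new IP, and from a script you can hit the server by IP while sending the desired Host header so the shared host serves the right site. Below is a minimal Python sketch of that; the IP address and host name are placeholders.

```python
# Minimal sketch: request a site by IP while sending the desired Host header,
# so a shared-hosting server can pick the correct virtual host before DNS updates.
# 203.0.113.10 and example.com are placeholder values.
import requests

response = requests.get(
    "http://203.0.113.10/",
    headers={"Host": "example.com"},
    timeout=10,
)
print(response.status_code)
print(response.text[:200])  # first part of the page served for example.com
```

    This is only a way to check the site before the DNS change propagates; it does not change what the browser sends.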

    Read the article

  • Fun With the Chrome JavaScript Console and the Pluralsight Website

    - by Steve Michelotti
    Originally posted on: http://geekswithblogs.net/michelotti/archive/2013/07/24/fun-with-the-chrome-javascript-console-and-the-pluralsight-website.aspxI’m currently working on my third course for Pluralsight. Everyone already knows that Scott Allen is a “dominating force” for Pluralsight but I was curious how many courses other authors have published as well. The Pluralsight Authors page - http://pluralsight.com/training/Authors – shows all 146 authors and you can click on any author’s page to see how many (and which) courses they have authored. The problem is: I don’t want to have to click into 146 pages to get a count for each author. With this in mind, I figured I could write a little JavaScript using the Chrome JavaScript console to do some “detective work.” My first step was to figure out how the HTML was structured on this page so I could do some screen-scraping. Right-click the first author - “Inspect Element”. I can see there is a primary <div> with a class of “main” which contains all the authors. Each author is in an <h3> with an <a> tag containing their name and link to their page:     This web page already has jQuery loaded so I can use $ directly from the console. This allows me to just use jQuery to inspect items on the current page. Notice this is a multi-line command. In order to use multiple lines in the console you have to press SHIFT-ENTER to go to the next line:     Now I can see I’m extracting data just fine. At this point I want to follow each URL. Then I want to screen-scrape this next page to see how many courses each author has done. Let’s take a look at the author detail page:       I can see we have a table (with a css class of “course”) that contains rows for each course authored. This means I can get the number of courses pretty easily like this:     Now I can put this all together. Back on the authors page, I want to follow each URL, extract the returned HTML, and grab the count. In the code below, I simply use the jQuery $.get() method to get the author detail page and the “data” variable that is in the callback contains the HTML. A nice feature of jQuery is that I can simply put this HTML string inside of $() and I can use jQuery selectors directly on it in conjunction with the find() method:     Now I’m getting somewhere. I have every Pluralsight author and how many courses each one has authored. But that’s not quite what I’m after – what I want to see are the authors that have the MOST courses in the library. What I’d like to do is to put all of the data in an array and then sort that array descending by number of courses. I can add an item to the array after each author detail page is returned but the catch here is that I can’t perform the sort operation until ALL of the author detail pages have executed. The jQuery $.get() method is naturally an async method so I essentially have 146 async calls and I don’t want to perform my sort action until ALL have completed (side note: don’t run this script too many times or the Pluralsight servers might think your an evil hacker attempting a DoS attack and deny you). My C# brain wants to use a WaitHandle WaitAll() method here but this is JavaScript. I was able to do this by using the jQuery Deferred() object. I create a new deferred object for each request and push it onto a deferred array. After each request is complete, I signal completion by calling the resolve() method. Finally, I use a $.when.apply() method to execute my descending sort operation once all requests are complete. 
    Here is my complete console command:

    var authorList = [],
        defList = [];
    $(".main h3 a").each(function() {
        var def = $.Deferred();
        defList.push(def);
        var authorName = $(this).text();
        var authorUrl = $(this).attr('href');
        $.get(authorUrl, function(data) {
            var courseCount = $(data).find("table.course tbody tr").length;
            authorList.push({ name: authorName, numberOfCourses: courseCount });
            def.resolve();
        });
    });
    $.when.apply($, defList).then(function() {
        console.log("*Everything* is complete");
        var sortedList = authorList.sort(function(obj1, obj2) {
            return obj2.numberOfCourses - obj1.numberOfCourses;
        });
        for (var i = 0; i < sortedList.length; i++) {
            console.log(authorList[i]);
        }
    });

    And here are the results: WOW! John Sonmez has 44 courses!! And Matt Milner has 29! I guess Scott Allen isn't the only "dominating force". I would have assumed Scott Allen was #1 but he comes in as #3 in total course count (of course Scott has 11 courses in the Top 50, and 14 in the Top 100, which is incredible!). Given that I'm in the middle of producing only my third course, I better get to work!

    Read the article

  • Gateway IP for eth0 and gateway IP for pptp vpn are same

    - by user286630
    My problem is this: 1) I'm using a laptop at school. 2) At school, the default gateway for Ethernet is 192.168.1.1. 3) I want to connect to a PPTP VPN server. The gateway over the VPN connection is also 192.168.1.1. (The VPN server assigns 192.168.100.1 to my laptop, and I confirmed that this address is not used at school.) In this situation, there is no problem in Windows 7. I think it is smart enough to distinguish two different gateways with the same IP; all connection requests seem to be forwarded to the VPN gateway. But in Ubuntu, I cannot access a file server at the remote site. I guess every connection request is forwarded to the Ethernet gateway. How can I send all connection requests to the VPN gateway whose IP is the same as the Ethernet gateway's?

    Read the article

  • google changing crawl speed: doesn't seem to work. Why?

    - by Olivier Pons
    Three days ago I changed the Google crawl rate of my website. Here it is: this means 2 requests per second. I got the message in Google Webmaster Tools that the rate change has been taken into account: but after more than three days, nothing has happened: still one request every ten seconds. See here: My web server is very fast and can handle up to twenty simultaneous connections. And my website is brand new, which means Google is almost the only one crawling it. After more than 30,000 successful requests (= no 404s), I think there's something going on... or maybe this is just a bug? Has anyone ever had this problem?

    Read the article

  • SFTP only works occasionally

    - by 82din
    I suddenly get this error using SFTP:

    Status: Connecting to example.com...
    Response: fzSftp started
    Command: open "[email protected]" 22
    Command: Pass: *********
    Status: Connected to example.com
    Status: Retrieving directory listing...
    Command: pwd
    Response: Current directory is: "/root"
    Command: ls
    Status: Listing directory /root
    Error: Connection timed out
    Error: Failed to retrieve directory listing

    I tried using FileZilla, Cyberduck, and the shell (Terminal), with the same result. However, it worked fine today (for just a few seconds) in passive mode. I guess something changed in my network, so I have tried both active and passive mode:

    Connecting to probe.filezilla-project.org
    Response: 220 FZ router and firewall tester ready
    USER FileZilla
    Response: 331 Give any password.
    PASS 3.6.0.2
    Response: 230 logged on.
    Checking for correct external IP address
    Retrieving external IP address from http://checkip.dyndns.org:8245/
    Checking for correct external IP address
    IP <external IP> big-bf-ccc-f
    Response: 200 OK
    PREP 49565
    Response: 200 Using port 49565, data token 380352881
    PORT 186,15,222,5,193,157
    Response: 200 PORT command successful
    LIST
    Response: 150 opening data connection
    Response: 503 Failure of data connection. Server sent unexpected reply.
    Connection closed

    Because I'm working behind a router, I get my external IP from http://checkip.dyndns.org:8245/ I also tested a different range of ports.

    Read the article

  • two computers on same network cannot ping each other nor view NetBIOS resources

    - by slava
    I'd like to find out the problem with my network configuration. My network setup is like in this diagram: the problem is between laptop1 and laptop2. At first I thought it was a Samba server problem. I was configuring a Samba server on one of the laptops and I wasn't able to access the shares from the second laptop no matter what I did. After installing/removing/configuring the Samba server a couple of times I realized that the problem lies somewhere else. Laptop configurations: Laptop1: Ubuntu 12.04; Laptop2: Windows 7 / Ubuntu 12.04 (dual boot); Server: Ubuntu 12.04. When I do "ping 192.168.0.10" from laptop2, I get "Destination host unreachable". The same happens when I ping in the other direction. When I access Laptop1's shares from Laptop2 with Windows 7 loaded, I get the error message: "Error code: 0x80070035 The network path was not found." When I ping "server", "router" or "wifi router" from either laptop, I get a reply. The same goes for Windows shares: I am able to access the server's shares from Windows and Ubuntu, from either of my laptops. NetBIOS can't be functioning correctly; that's obvious, since I am unable to access Windows shares between the laptops. I assume there is a misconfiguration on the "wifi router", but I can't find what it is specifically. The "wifi router" works as a hub + wifi; it is connected to the "router" not via its WAN port but via LAN1. Please help me configure the router correctly so the laptops can see each other, or at least make NetBIOS work correctly between them so I can access Windows shares. Thanks!

    Read the article

  • Xoom Giveaway Courtesy of the Complete Android Guide [Giveaway]

    - by Jason Fitzpatrick
    If you’re an Android fan and looking to score an Android 3.0 tablet, you can enter to win a Xoom tablet courtesy of the Complete Android Guide. What do you need to do? Per their official rules: Contribute content to the site. To do so: Sign up (via the Register link in the top-right corner). Email android ‘at’ completeguides ‘dot’ net and request contributor access to this site. Write a killer tutorial, reference or chapter for the book. Buy the book, in paperback or ebook form. The deadline is March 31; the winner will be drawn in April. Note: The link to the official rules appears defunct; we’ll update shortly when the URL is fixed. Xoom Drawing @ Complete Android Guide [Complete Guides]

    Read the article

  • What commands are needed to install Ubuntu Core?

    - by Oxwivi
    Ubuntu Core's wiki page contains the instructions to install Ubuntu Core on a target medium:

    Uncompress (do not unpack) the rootfs
    Format the target media: at least one partition should be ext2, ext3, or ext4
    Unpack Ubuntu Core to the ext{2,3,4} partition
    Install a boot loader
    Install Linux
    If the Linux kernel requires modules, add these to /lib/modules/$(uname -r) in the ext{2,3,4} file system
    Boot the target device
    Install any additional required software using apt-get

    But what are the specific commands to do the above? The things I'm specifically confused about are: Uncompressing and unpacking: what's the difference and how do I do each? What package should I install if I want the generic kernel provided in a regular Ubuntu installation? I won't be installing any drivers or anything kernel-related other than what's provided in the repos; do I need to worry about manually adding kernel modules? PS: I would like to request that all the commands used in the installation process be mentioned in the answer, for the benefit of those who are completely unfamiliar, and for myself should I ever forget.

    Read the article

  • ssh key questions

    - by Tim
    I have some questions regarding generating keys for SSH access:

    (1) Suppose there are two computers running the SSH server service, and I have generated a pair of key files on computer A and copied the public file to computer B. Is it true that this is only a one-way key: we only gave computer A permission to access computer B, not computer B permission to access computer A? If I now want to ssh from computer B to computer A, must I generate another pair of key files on computer B and copy the public file to computer A?

    (2) If I would like to connect a single local computer to several remote servers, should I generate a common pair of key files only once on the local computer and copy the same public file to all the remote servers, or generate a different pair of key files on the local computer for each remote server?

    (3) If I would like to connect several local computers to a single remote server, when copying the public files from the different local computers to the remote server, should they be combined into a single authorized_keys file or stored in different authorized_keys files?

    (4) If several servers share the same file system via, for example, NFS, how do I generate keys and arrange the key files for access from one server to another? And how do I generate keys and arrange the key files for a local computer to access any one of the servers?

    All the machines above are Linux. Please provide examples and commands in your reply so that I can better understand how to solve the problems. Thanks and regards!

    Read the article

  • Timeout Considerations for Solicit Response – Part 2

    - by Michael Stephenson
    To follow up a previous article about timeouts and how they can affect your application I have extended the sample we were using to include WCF. I will execute some test scenarios and discuss the results. The sample We begin by consuming exactly the same web service which is sitting on a remote server. This time I have created a .net 3.5 application which will consume the web service using the basichttp binding. To show you the configuration for the consumption of this web service please refer to the below diagram. You can see like before we also have the connectionManagement element in the configuration file. I have added a WCF service reference (also using the asynchronous proxy methods) and have the below code sample in the application which will asynchronously make the web service calls and handle the responses on a call back method invoked by a delegate. If you have read the previous article you will notice that the code is almost the same.   Sample 1 – WCF with Default Timeouts In this test I set about recreating the same scenario as previous where we would run the test but this time using WCF as the messaging component. For the first test I would use the default configuration settings which WCF had setup when we added a reference to the web service. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"   The Test We simulated 21 calls to the web service Test Results The client-side trace is as follows:   The server-side trace is as follows: Some observations on the results are as follows: The timeouts happened quicker than in the previous tests because some calls were timing out before they attempted to connect to the server The first few calls that timed out did actually connect to the server and did execute successfully on the server   Test 2 – Increase Open Connection Timeout & Send Timeout In this test I wanted to increase both the send and open timeout values to try and give everything a chance to go through. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:10:00" receiveTimeout="00:10:00" sendTimeout="00:10:00"   The Test We simulated 21 calls to the web service   Test Results The client side trace for this test was   The server-side trace for this test was: Some observations on this test are: This test proved if the timeouts are high enough everything will just go through   Test 3 – Increase just the Send Timeout In this test we wanted to increase just the send timeout. The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:10:00"   The Test We simulated 21 calls to the web service   Test Results The below is the client side trace The below is the server side trace Some observations on this test are: In this test from both the client and server perspective everything ran through fine The open connection timeout did not seem to have any effect   Test 4 – Increase Just the Open Connection Timeout In this test I wanted to validate the change to the open connection setting by increasing just this on its own. 
The timeout values for this test are: closeTimeout="00:01:00" openTimeout="00:10:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"   The Test We simulated 21 calls to the web service Test Results The client side trace was The server side trace was Some observations on this test are: In this test you can see that the open connection which relates to opening the channel timeout increase was not the thing which stopped the calls timing out It's the send of data which is timing out On the server you can see that the successful few calls were fine but there were also a few calls which hit the server but timed out on the client You can see that not all calls hit the server which was one of the problems with the WSE and ASMX options   Test 5 – Smaller Increase in Send Timeout In this test I wanted to make a smaller increase to the send timeout than previous just to prove that it was the key setting which was controlling what was timing out. The timeout values for this test are: openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:02:30"   The Test We simulated 21 calls to the web service Test Results The client side trace was   The server side trace was Some observations on this test are: You can see that most of the calls got through fine On the client you can see that call 20 timed out but still hit the server and executed fine.   Summary At this point between the two articles we have quite a lot of scenarios showing the different way the timeout setting have played into our original performance issue, and now we can see how WCF could offer an improved way to handle the problem. To summarise the differences in the timeout properties for the three technology stacks: ASMX The timeout value only applies to the execution time of your request on the server. The timeout does not consider how long your code might be waiting client side to get a connection. WSE The timeout value includes both the time to obtain a connection and also the time to execute the request. A timeout will not be thrown as an error until an attempt to connect to the server is made. This means a 40 second timeout setting may not throw the error until 60 seconds when the connection to the server is made. If the connection to the server is made you should be aware that your message will be processed and you should design for this. WCF The WCF send timeout is the setting most equivalent to the settings we were looking at previously. Like WSE this setting the counter includes the time to get a connection as well as the time to execute on a server. Unlike WSE and ASMX an error will be thrown as soon as the send timeout from making your call from user code has elapsed regardless of whether we are waiting for a connection or have an open connection to the server. This may to a user appear to have better latency in getting an error response compared to WSE or ASMX.

    Read the article

  • The ABC of Front End Web Development

    - by Geertjan
    And here it is, the long awaited "ABC" of front end web development, in which the items I never knew existed until I was looking to fill the gaps link off to the sites where more info can be found on them.

    A is for Android and AngularJS
    B is for Backbone.js and Bower
    C is for CSS and Cordova
    D is for Docker
    E is for Ember.js and Ext JS
    F is for Frisby.js
    G is for Grunt
    H is for HTML
    I is for Ionic and iPhone
    J is for JavaScript, Jasmine, and JSON
    K is for Knockout.js and Karma
    L is for LESS
    M is for Mocha
    N is for NetBeans and Node.js
    O is for "Oh no, my JS app is unmaintainable!"
    P is for PHP, Protractor, and PhoneGap
    Q is for Queen.js
    R is for Request.js
    S is for SASS, Selenium, and Sublime
    T is for TestFairy
    U is for Umbrella
    V is for Vaadin
    W is for WebStorm
    X is for XML
    Y is for Yeoman
    Z is for Zebra

    Read the article

  • Download the Windows 8 Logo and Icons to Use on Your Favorite Computer

    - by Asian Angel
    Do you like how the icons from Windows 8 look and want to use them on a different system? Then you are in luck! The good folks over at 7 Tutorials have pulled out nineteen icons from Windows 8 and packaged them into a downloadable set. Note: White icons not shown above. Download the Windows 8 Logo & Other Windows 8 Icons [7 Tutorials]

    Read the article

  • A Head Start for Partners – in Support, Too

    - by Alliances & Channels Redaktion
    Solid support is a matter of course at Oracle; that is nothing new. But did you also know that Oracle Support offers special conditions and tools for partners? Getting there is simple: log in to the OPN portal. The click path "Partner with Oracle", "Get started", "Levels and Benefits" and "View all benefits" takes you to an overview of which level comes with which support benefits. As a partner you receive your own Oracle Partner SI number, that is, a Support Identifier, which opens up access to the knowledge base, technical documentation, the patch download area and various communities in the "My Oracle Support" portal. You also have the option, of course, of purchasing Service Request (SR) packages. Depending on your partner level, you have a certain number of free Service Requests at your disposal, and you can increase that number with each additional specialization. And when buying support for their own use, our partners receive a discount. A look at the OPN portal is therefore worthwhile for support questions, too!

    Read the article

  • Where should I draw the line between unit tests and integration tests? Should they be separate?

    - by Earlz
    I have a small MVC framework I've been working on. Its code base definitely isn't big, but it's no longer just a couple of classes. I finally decided to take the plunge and start writing tests for it (yes, I know I should've been doing that all along, but its API was super unstable up until now). Anyway, my plan is to make it extremely easy to test, including integration tests. An example integration test would go something along these lines: fake HTTP request object -> MVC framework -> HTTP response object -> check the response is correct. Because this is all doable without any state or special tools (browser automation etc.), I could actually do this with ease in regular unit test frameworks (I use NUnit). Now the big question: where exactly should I draw the line between unit tests and integration tests? Should I only test one class at a time (as much as possible) with unit tests? Also, should integration tests be placed in the same testing project as my unit testing project?
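    Just to illustrate the shape of the integration test described above (the poster's stack is C# with NUnit; the sketch below uses Python's built-in unittest only to keep it self-contained, and FakeRequest, Framework and Response are invented stand-ins):

```python
import unittest

# Invented stand-ins for the illustration: a fake request, a trivial
# "framework" front controller, and a response object.
class FakeRequest:
    def __init__(self, method, path):
        self.method, self.path = method, path

class Response:
    def __init__(self, status, body):
        self.status, self.body = status, body

class Framework:
    def __init__(self):
        self.routes = {("GET", "/hello"): lambda req: Response(200, "hello world")}

    def handle(self, request):
        handler = self.routes.get((request.method, request.path))
        return handler(request) if handler else Response(404, "not found")

# Integration-style test: it drives the whole request -> response path, yet
# needs no server, browser, or shared state, so it runs fine in a unit test runner.
class HelloRouteTest(unittest.TestCase):
    def test_hello_route_returns_200(self):
        response = Framework().handle(FakeRequest("GET", "/hello"))
        self.assertEqual(response.status, 200)
        self.assertEqual(response.body, "hello world")

if __name__ == "__main__":
    unittest.main()
```

    Both styles can run in the same test framework; the difference is mainly how much of the system a single test crosses.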

    Read the article

  • Rails solution for mobile-specific content filter?

    - by Damien Roche
    To note, I'm not interested in simply 'hiding' content for mobile devices, I want to filter out that content completely. I'm also not trying to address the issue by building a mobile specific interface (mob.example.com). There was another question regarding something similar: How do I prevent useless content load on the page in responsive design? The solution, in that post, was to set a session during the initial request, and then use the session to filter content on subsequent requests. I primarily develop in Rails, and I'm wondering if there are any gems or ruby-specific solutions to this problem?

    Read the article

  • How DNS Works [Video]

    - by Jason Fitzpatrick
    Want an easy and visual way to explain DNS to a curious friend or cubemate? This clean and simple short video does a great job highlighting exactly what goes on during a typical DNS request. Last month we explained what DNS is and showed you why you might want to use alternate DNS servers; this short video serves as an excellent visual companion for our article. How DNS Works [YouTube]

    Read the article
