Search Results

Search found 15591 results on 624 pages for 'problems'.


  • Conky window jumps to the top

    - by Scott Severance
    Occasionally, my Conky window jumps to the top and covers all other windows. The only way to solve it is to kill and restart Conky. This happens at seemingly random times while using Compiz features. It seems especially common while using the scale plugin's window picker, but no plugin consistently causes this problem every time. I've seen several questions that appear related on the surface. However, all those questions are solved by ensuring that Conky starts after Compiz. In my case, my problems occur even if Conky starts after Compiz.

    Read the article

  • Is there a bug with restart in Edubuntu 12.04?

    - by Ket
    After a clean install of Edubuntu 12.04 on an Acer AO531-h netbook, restart doesn't work. The process starts normally, but just before it shuts down the netbook freezes and I have to force a shutdown. The command "sudo reboot" has the same problem. I have no issues with shutting down, only with restarting. I'm an absolute beginner. Netbook specs: Acer, Intel Atom CPU N270 @ 1.60 GHz, 1.05 GHz, 0.98 GB RAM, dual booting with Windows XP. No problems with Windows.

    Read the article

  • OSB, Service Callouts and OQL - Part 1

    - by Sabha
    Oracle Fusion Middleware customers use Oracle Service Bus (OSB) for virtualizing service endpoints and implementing stateless service orchestrations. Behind the performance and speed of OSB, there are a couple of key design implementations that can affect application performance and behavior under heavy load. One of the most heavily used features in OSB is the Service Callout pipeline action, used for message enrichment and for invoking multiple services as part of a single orchestration. Overuse of this feature, without understanding its internal implementation, can lead to serious problems. This post will delve into OSB internals, the problem associated with use of Service Callout under high loads, how to diagnose it via thread dump and heap dump analysis using tools like ThreadLogic and OQL (Object Query Language), and how to resolve it. The first section in the series will mainly cover the threading model used internally by OSB for implementing Route vs. Service Callout actions. Please refer to the blog post for more details.

    Read the article

  • Cursor (touchpad) moves and clicks erratically

    - by James Wood
    Sometimes (usually after two-finger scrolling) the touchpad on my Asus X54C becomes unresponsive and the cursor begins to click and move small distances. Clicking seems to happen more often than moving. Unlike with other similar problems, I've never seen the cursor move to (0, 0). Suspending (closing the lid) and unsuspending doesn't help, and neither does moving to a tty and back or rebooting. I've also tried disabling the touchpad via Fn+F9. That tends to take a long time, but doesn't have any effect. I'm on 13.10 at the moment, but I remember it happening on 13.04 as well. Here's the pointer section of xinput:

        ⎡ Virtual core pointer                    id=2    [master pointer (3)]
        ⎜   ↳ Virtual core XTEST pointer          id=4    [slave pointer (2)]
        ⎜   ↳ ETPS/2 Elantech Touchpad            id=12   [slave pointer (2)]

    Read the article

  • Audio playback: part of song is skipped

    - by Homulvas
    I am experiencing some problems with music playback after upgrading to Ubuntu 12.10. Basically, some songs stop playing after some time, as if the song had ended. It's always the same songs and the same time. The weird thing is that it happens with Clementine and Totem, but VLC doesn't have this problem, and the files also play as they should on Windows. I'm guessing there might be a problem with some library that's shared by the first two applications. I don't know if it's relevant, but the file format of the audio files is FLAC (I don't know if the problem affects MP3, because I don't have many of them).

    Read the article

  • eGalax Touchscreen not working Jolicloud 1.2

    - by craigsmith86
    I have an eGalax touchscreen on an Acer Aspire One running Jolicloud 1.2. I have had success getting this touchscreen to work correctly on Ubuntu 10.04 NBR, 11.04, Kubuntu 12.04 and Puppy Linux, so I am pretty happy with how it SHOULD be done. However, I cannot get it to calibrate correctly or remember calibration settings. I have installed the eGalax utility (all available versions) and it does not recognize the screen. xinput_calibrator works, but the config cannot be made permanent. Problems I have identified:
    - Joli doesn't have an xorg.conf file and does not use xorg.conf.d for evdev configuration
    - Setting configs through HAL doesn't work anymore
    The best I can get is a poorly adjusted touchscreen with a reversed Y axis. Any help greatly appreciated.

    Read the article

  • Cannot login via Unity login screen after upgrade to 12.04

    - by codesurgeon
    Logging in via the shell accessed through Ctrl+Alt+F1 works, and logging in as guest via the graphical user interface also works. When I try to log into my standard user account via the graphical interface, the screen flashes to black for a couple of seconds and bumps me back to a pristine login screen. Entering a wrong password for my user account yields the standard error message, so my user account and credential verification seem to be OK. I suppose that my individual graphics configuration causes problems, but I'm not sure how to reset that. I've tried stopping the UI via "sudo service lightdm stop", executing "sudo nvidia-xconfig" and restarting the UI with "sudo service lightdm start", to no avail. My workstation has an Nvidia GeForce 560-448 graphics card. I've tried getting this fixed with the latest Nvidia 64-bit drivers (cURL'ed from the official website), that is 295.49, and with the latest beta driver, 302.07. Anybody have an idea how to get this fixed? Your help is appreciated :)

    Read the article

  • Laptop buttons not working on a Dell Latitude e6400

    - by Ido
    I have installed Ubuntu 12.10 64-bit on my Dell Latitude E6400 laptop. I have the latest BIOS version for my laptop (A32), which I downloaded from the official Dell website by checking for updates matching the service tag on my laptop. I'm having serious problems with all the mouse buttons on my laptop. The right-click button doesn't work at all under any circumstance. The left-click button only works when clicking on icons in the left side-bar and in the top menu (like clicking on the power icon in the top-right corner or clicking on the file menu). However, none of the buttons work in any of the applications, even simple ones like the file browser (home folder). For example, when I try to click on a folder inside Nautilus it doesn't work, and when I try to click on the "x" close icon to close the window it also doesn't work. Can you help me figure it out?

    Read the article

  • How do I improve my problem-solving ability

    - by gcc
    How can I improve my problem-solving ability? Everyone says the same thing: "a real programmer knows how to handle real problems", but they forget how or where they learned this ability (certainly not in school; in my opinion, no one teaches it there). If you have any ideas beyond the usual ones (solve more problems, do more exercises, write code, search Google, then write more...), feel free to give your advice. For me, that kind of advice is like being told to "use a complex/known library instead of writing your own." In other words, I want your experience, book recommendations, or web pages on problem solving.

    Read the article

  • Slerping rotation mirrors

    - by Esa
    I rotate my game character to look at the target using the following code: transform.rotation = Quaternion.Slerp(startQuaternion, lookQuaternion, turningNormalizer*turningSpeed/10f). startQuaternion is the character's current rotation when a new target is given. lookQuaternion is the direction the character should look at, and it's set like this: destinationVector = currentWaypoint.transform.position - transform.position; lookQuaternion = Quaternion.LookRotation(destinationVector, Vector3.up); turningNormalizer is just Time.deltaTime incremented, and turningSpeed is a static value given in the editor. The problem is that while the character turns as it should most of the time, it has problems when it has to turn close to 180 degrees. Then it at times jitters and mirrors the rotation: in this poorly drawn image, the character (on the right) starts to turn towards the circle on the left. Instead of just turning either left or right, it starts this "mirror dance": it starts to rotate towards the new facing, then it suddenly snaps to the same angle but on the other side and keeps rotating. It does this "mirroring" until it finally looks at the target. Is this a thing with quaternions, slerping/lerping or something else?
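    One well-known source of this kind of flip is the quaternion double cover: q and -q describe the same orientation, but interpolating towards whichever of the two has a negative dot product with the start rotation swings the long way round and can look like a mirrored turn near 180 degrees. The sketch below is plain TypeScript, not Unity's API; the Quat type and helper names are made up for illustration and only show the shortest-arc idea. Unity's own Quaternion.Slerp already takes the shortest arc internally, so if the mirroring persists it is also worth checking that startQuaternion and turningNormalizer are not being reset every frame while the target direction flips sign.

        // Hypothetical minimal quaternion type (x, y, z, w); not a real engine API.
        type Quat = { x: number; y: number; z: number; w: number };

        const dot = (a: Quat, b: Quat) => a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;

        function normalize(q: Quat): Quat {
          const len = Math.sqrt(dot(q, q));
          return { x: q.x / len, y: q.y / len, z: q.z / len, w: q.w / len };
        }

        // Slerp that always takes the shortest arc: if the two rotations lie on opposite
        // hemispheres (dot < 0), flip the target before interpolating, which prevents the
        // "long way round" mirror-turn when the angle is close to 180 degrees.
        function slerpShortest(from: Quat, to: Quat, t: number): Quat {
          let cos = dot(from, to);
          let target = to;
          if (cos < 0) {
            target = { x: -to.x, y: -to.y, z: -to.z, w: -to.w };  // same rotation, opposite sign
            cos = -cos;
          }
          if (cos > 0.9995) {
            // Rotations are nearly identical: plain lerp avoids dividing by sin(theta) ~ 0.
            return normalize({
              x: from.x + (target.x - from.x) * t,
              y: from.y + (target.y - from.y) * t,
              z: from.z + (target.z - from.z) * t,
              w: from.w + (target.w - from.w) * t,
            });
          }
          const theta = Math.acos(cos);
          const a = Math.sin((1 - t) * theta) / Math.sin(theta);
          const b = Math.sin(t * theta) / Math.sin(theta);
          return {
            x: a * from.x + b * target.x,
            y: a * from.y + b * target.y,
            z: a * from.z + b * target.z,
            w: a * from.w + b * target.w,
          };
        }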

    Read the article

  • shutdown logging in ubuntu 10.04 & 11.10

    - by Joe
    When my system starts up it logs everything into syslog/dmesg. And I can review it for problems. When my system shuts down, where does that get logged? I didn't see anything obvious in /var/log in 10.04. (My 11.10 system is out of reach at the moment.) I looked at How do I turn on 'shutdown logging' or operating system tracing? but didn't see anything that helped. I use kubuntu, but all of the stuff at this level is probably the same.

    Read the article

  • Framework Folders and Duplicate File Names

    - by Kevin Smith
    I have been working with Framework Folders a little bit in the past few days and found one unexpected behavior that is different from Contribution Folders (Folders_g). If you check a file into a Framework Folder and a file with that name already exists in the folder, it will allow the check-in and rename the file for you. In Folders_g this would have generated an error and prevented you from checking in the file. A quick check of the Framework Folders configuration settings in the Application Administrator's Guide for Content Server does not show a configuration parameter to control this. I'm still thinking about this and not sure if I like this new behavior or not. I guess from a user perspective this more closely aligns Framework Folders with how Windows handles duplicate file names, but if you are migrating from Folders_g and expect a duplicate file name to be rejected, this might cause you some problems.

    Read the article

  • What is the situation about OpenGL under Ubuntu Unity and Gnome3?

    - by user827992
    In a GNU/Linux distribution, Xorg is usually installed as the main graphical server. It operates with a client-server logic: a special window is designated as the desktop environment, and this special window handles all the eye-candy stuff like decorations, icons and effects. The problem is that the latest UIs rely heavily on hardware acceleration: Unity is an overlay on Compiz, and GNOME Shell also requires an active driver for the GPU to work well. So, given that on the same OS I can find multiple implementations of OpenGL:
    - Who is handling my OpenGL buffer?
    - How is the OpenGL buffer managed compared to the other windows?
    - How can I be sure that my OpenGL implementation is glued to the hardware and is not tied to the client-server logic of Xorg?
    For example, I have tried the Clutter library and have only experienced problems under both Unity and GTK/GNOME, with no problems under other OSes.

    Read the article

  • How do you unit test your javascript

    - by Erin
    I have been spending a lot of time working in JavaScript of late, and I have not found a way of testing JavaScript that seems to work well. In the past this hasn't been a problem for me, since most of the websites I worked on had very little JavaScript in them. I now have a new website that makes extensive use of jQuery, and I would like to build unit tests for most of the system. My problems are these: most of the functions make changes to the DOM in some way, and most of the functions also request data from the web server and require a session on the service to get results back. I would like to run the tests from either a command line or a test-running harness rather than in a browser. Any help or articles I should be reading would be helpful.
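    For a jQuery-heavy site, one common choice is QUnit, the framework jQuery itself is tested with: DOM changes are asserted against the #qunit-fixture element, which QUnit resets between tests, and calls to the server are stubbed out so no session is needed. Below is a rough TypeScript sketch assuming jQuery and QUnit are loaded on the test page; loadUser and the markup are hypothetical, purely for illustration.

        declare const $: any;      // jQuery, assumed loaded on the test page
        declare const QUnit: any;  // QUnit test framework, assumed loaded on the test page

        // Hypothetical function under test: fetches a user and writes the name into the DOM.
        function loadUser(id: number): void {
          $.ajax({
            url: "/api/users/" + id,
            success: function (data: { name: string }) {
              $("#user-name").text(data.name);
            },
          });
        }

        QUnit.module("loadUser", {
          beforeEach: function () {
            // #qunit-fixture is reset by QUnit after every test, so DOM changes cannot leak.
            $("#qunit-fixture").append('<span id="user-name"></span>');
          },
        });

        QUnit.test("writes the returned name into the page", function (assert: any) {
          // Stub out $.ajax so the test needs no web server and no session.
          const realAjax = $.ajax;
          $.ajax = function (options: any) {
            options.success({ name: "Erin" });   // pretend the server answered immediately
          };
          try {
            loadUser(42);
            assert.equal($("#user-name").text(), "Erin");
          } finally {
            $.ajax = realAjax;                   // always restore the real implementation
          }
        });

    The same tests can then be driven from the command line by a headless runner (PhantomJS and, later, headless browsers were the typical choices), which addresses the "not in a browser window" requirement.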

    Read the article

  • Developing Schema Compare for Oracle (Part 6): 9i Query Performance

    - by Simon Cooper
    All throughout the EAP and beta versions of Schema Compare for Oracle, our main request was support for Oracle 9i. After releasing version 1.0 with support for 10g and 11g, our next step was then to get version 1.1 of SCfO out with support for 9i. However, there were some significant problems that we had to overcome first. This post will concentrate on query execution time.

    When we first tested SCfO on a 9i server, after accounting for various changes to the data dictionary, we found that database registration was taking a long time. And I mean a looooooong time. The same database that on 10g or 11g would take a couple of minutes to register would be taking upwards of 30 mins on 9i. Obviously, this is not ideal, so a poke around the query execution plans was required.

    As an example, let's take the table population query - the one that reads ALL_TABLES and joins it with a few other dictionary views to get us back our list of tables. On 10g, this query takes 5.6 seconds. On 9i, it takes 89.47 seconds. The difference in execution plan is even more dramatic - here's the (edited) execution plan on 10g:

        -------------------------------------------------------------------------------
        | Id  | Operation              | Name                   | Bytes | Cost |
        -------------------------------------------------------------------------------
        |   0 | SELECT STATEMENT       |                        |  108K |  939 |
        |   1 | SORT ORDER BY          |                        |  108K |  939 |
        |   2 | NESTED LOOPS OUTER     |                        |  108K |  938 |
        |*  3 | HASH JOIN RIGHT OUTER  |                        |  103K |  762 |
        |   4 | VIEW                   | ALL_EXTERNAL_LOCATIONS |  2058 |    3 |
        |* 20 | HASH JOIN RIGHT OUTER  |                        | 73472 |  759 |
        |  21 | VIEW                   | ALL_EXTERNAL_TABLES    |  2097 |    3 |
        |* 34 | HASH JOIN RIGHT OUTER  |                        | 39920 |  755 |
        |  35 | VIEW                   | ALL_MVIEWS             |    51 |    7 |
        |  58 | NESTED LOOPS OUTER     |                        | 39104 |  748 |
        |  59 | VIEW                   | ALL_TABLES             |  6704 |  668 |
        |  89 | VIEW PUSHED PREDICATE  | ALL_TAB_COMMENTS       |  2025 |    5 |
        | 106 | VIEW                   | ALL_PART_TABLES        |   277 |   11 |
        -------------------------------------------------------------------------------

    And the same query on 9i:

        -------------------------------------------------------------------------------
        | Id  | Operation              | Name                   | Bytes | Cost |
        -------------------------------------------------------------------------------
        |   0 | SELECT STATEMENT       |                        |   16P |  55G |
        |   1 | SORT ORDER BY          |                        |   16P |  55G |
        |   2 | NESTED LOOPS OUTER     |                        |   16P | 862M |
        |   3 | NESTED LOOPS OUTER     |                        | 5251G | 992K |
        |   4 | NESTED LOOPS OUTER     |                        | 4243M | 2578 |
        |   5 | NESTED LOOPS OUTER     |                        | 2669K | 1440 |
        |*  6 | HASH JOIN OUTER        |                        |  398K |  302 |
        |   7 | VIEW                   | ALL_TABLES             |  342K |  276 |
        |  29 | VIEW                   | ALL_MVIEWS             |    51 |   20 |
        |* 50 | VIEW PUSHED PREDICATE  | ALL_TAB_COMMENTS       |  2043 |      |
        |* 66 | VIEW PUSHED PREDICATE  | ALL_EXTERNAL_TABLES    | 1777K |      |
        |* 80 | VIEW PUSHED PREDICATE  | ALL_EXTERNAL_LOCATIONS | 1744K |      |
        |* 96 | VIEW                   | ALL_PART_TABLES        |  852K |      |
        -------------------------------------------------------------------------------

    Have a look at the cost column. 10g's overall query cost is 939, and 9i is 55,000,000,000 (or more precisely, 55,496,472,769). It's also having to process far more data. What on earth could be causing this huge difference in query cost? After trawling through the '10g New Features' documentation, we found item 1.9.2.21. Before 10g, Oracle advised that you do not collect statistics on data dictionary objects. From 10g, it advised that you do collect statistics on the data dictionary; for our queries, Oracle therefore knows what sort of data is in the dictionary tables, and so can generate an efficient execution plan.

    On 9i, no statistics are present on the system tables, so Oracle has to use the Rule Based Optimizer, which turns most LEFT JOINs into nested loops. If we force 9i to use hash joins, like 10g, we get a much better plan:

        -------------------------------------------------------------------------------
        | Id  | Operation              | Name                   | Bytes | Cost |
        -------------------------------------------------------------------------------
        |   0 | SELECT STATEMENT       |                        | 7587K | 3704 |
        |   1 | SORT ORDER BY          |                        | 7587K | 3704 |
        |*  2 | HASH JOIN OUTER        |                        | 7587K |  822 |
        |*  3 | HASH JOIN OUTER        |                        | 5262K |  616 |
        |*  4 | HASH JOIN OUTER        |                        | 2980K |  465 |
        |*  5 | HASH JOIN OUTER        |                        |  710K |  432 |
        |*  6 | HASH JOIN OUTER        |                        |  398K |  302 |
        |   7 | VIEW                   | ALL_TABLES             |  342K |  276 |
        |  29 | VIEW                   | ALL_MVIEWS             |    51 |   20 |
        |  50 | VIEW                   | ALL_PART_TABLES        |  852K |  104 |
        |  78 | VIEW                   | ALL_TAB_COMMENTS       |  2043 |   14 |
        |  93 | VIEW                   | ALL_EXTERNAL_LOCATIONS | 1744K |   31 |
        | 106 | VIEW                   | ALL_EXTERNAL_TABLES    | 1777K |   28 |
        -------------------------------------------------------------------------------

    That's much more like it. This drops the execution time down to 24 seconds. Not as good as 10g, but still an improvement.

    There are still several problems with this, however. 10g introduced a new join method - a right outer hash join (used in the first execution plan). The 9i query optimizer doesn't have this option available, so forcing a hash join means it has to hash the ALL_TABLES table, and furthermore re-hash it for every hash join in the execution plan; this could be thousands and thousands of rows. And although forcing hash joins somewhat alleviates this problem on our test systems, there's no guarantee that this will improve the execution time on customers' systems; it may even increase the time it takes (say, if all their tables are partitioned, or they've got a lot of materialized views). Ideally, we would want a solution that provides a speedup whatever the input.

    To try and get some ideas, we asked some Oracle performance specialists to see if they had any ideas or tips. Their recommendation was to add a hidden hook into the product that allowed users to specify their own query hints, or even rewrite the queries entirely. However, we would prefer not to take that approach; as well as a lot of new infrastructure & a rewrite of the population code, it would have meant that any users of 9i would have to spend some time optimizing it to get it working on their system before they could use the product. Another approach was needed.

    All our population queries have a very specific pattern - a base table provides most of the information we need (ALL_TABLES for tables, or ALL_TAB_COLS for columns) and we do a left join to extra subsidiary tables that fill in gaps (for instance, ALL_PART_TABLES for partition information). All the left joins use the same set of columns to join on (typically the object owner & name), so we could re-use the hash information for each join, rather than re-hashing the same columns for every join. To allow us to do this, along with various other performance improvements that could be done for the specific query pattern we were using, we read all the tables individually and do a hash join on the client. Fortunately, this 'pure' algorithmic problem is the kind that can be very well optimized for expected real-world situations; as well as storing row data we're not using in the hash key on disk, we use very specific memory-efficient data structures to store all the information we need.
    This allows us to achieve a database population time that is as fast as on 10g, and even (in some situations) slightly faster, with a memory overhead of roughly 150 bytes per row of data in the result set (for schemas with 10,000 tables, that means an extra 1.4MB of memory being used during population). Next: fun with the 9i dictionary views.
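    To make the pattern concrete, here is a rough TypeScript sketch (not SCfO's actual implementation) of the client-side hash join described above: the base table is hashed once on the join key, and every subsidiary view then probes that one map instead of being re-hashed for each join. The Row shape and the fetchAll helper mentioned in the usage comment are purely illustrative.

        // Illustrative row shape; the real dictionary views have many more columns.
        interface Row { owner: string; name: string; [col: string]: string; }

        const joinKey = (r: Row) => r.owner + "." + r.name;

        // Hash the base table (e.g. ALL_TABLES) exactly once.
        function buildHashTable(baseRows: Row[]): Map<string, Row> {
          const byKey = new Map<string, Row>();
          for (const row of baseRows) {
            byKey.set(joinKey(row), row);
          }
          return byKey;
        }

        // Left-join a subsidiary table (e.g. ALL_PART_TABLES) by probing the existing map;
        // the base rows are never re-hashed, however many subsidiary views are joined in.
        function leftJoinInto(byKey: Map<string, Row>, subsidiaryRows: Row[], prefix: string): void {
          for (const sub of subsidiaryRows) {
            const base = byKey.get(joinKey(sub));
            if (base !== undefined) {
              for (const col of Object.keys(sub)) {
                if (col !== "owner" && col !== "name") {
                  base[prefix + "." + col] = sub[col];   // fill in the gaps on the base row
                }
              }
            }
            // Subsidiary rows with no matching base row are dropped, as the LEFT JOIN would do.
          }
        }

        // Usage sketch: one hash build, then one cheap probe pass per subsidiary view.
        // const tables = buildHashTable(fetchAll("ALL_TABLES"));        // hypothetical fetch
        // leftJoinInto(tables, fetchAll("ALL_PART_TABLES"), "part");
        // leftJoinInto(tables, fetchAll("ALL_EXTERNAL_TABLES"), "ext");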

    Read the article

  • Is Ubuntu running well on a USB HDD? Need suggestions

    - by Klaus
    Dear Linux and Ubuntu pros, I have here a company notebook, and because the HDD is fully encrypted I cannot create an extra partition for another system that I would like to use in my free time. And I really need another system, because this crap Windows here, with that much antivirus, antispyware and anti-whatever on it, is sooo slow and annoying. What can I do? I could use an external USB HDD with another system. Because I would like to handle big files and so on, I don't want to use a USB stick. A USB 2.5" HDD + Ubuntu is, I think, the best option. Here are my questions: Is there anything I need to be aware of? Does Ubuntu run well on an external HDD? Will I have big performance problems (because of the USB HDD)? Should I buy a very fast HDD for a lot of money, or is it not that important? Any suggestions? Thank you :)

    Read the article

  • Regular wireless dropouts on new lenovo T440s in Ubuntu 14.04

    - by user290670
    Over the last 24 hours my wireless has dropped out regularly. I've tested to make sure it isn't my router (my phone and everyone else in the house are having no problems using the wifi). This is a brand-new installation of Ubuntu 14.04, and according to uname I'm running kernel 3.13.0-24-generic. Now, my laptop has an Intel 7260AC dual-band wireless card, and I've read that Ubuntu has been having trouble with these. I notice that at http://www.intel.com/support/wireless/wlan/sb/CS-034398.htm there are some updated drivers for my kernel version. However, I have no idea how to get the kernel to use these drivers instead, so that I can see whether this fixes it. Can anyone help? These dropouts are really annoying. EDIT: Upgrading to kernel 3.14.6 did not help.

    Read the article

  • Wireless Connection Troubles

    - by James
    I just recently switched from Windows 7 over to Ubuntu 12.04 and have been experiencing some issues connecting to my home's wireless network. The only way I can get it to connect to the network is by disabling the IPv4 and IPv6 settings. Even then, while it says it's connected to the network (3 bars), I'm unable to access the Internet. It connected for a little while after I first installed Ubuntu, but since the first reboot I haven't been able to access the web at all. I have very basic knowledge when it comes to computers and barely any when dealing with Ubuntu and Linux. I'm very happy with Ubuntu apart from this one issue; before, my computer was overheating and crashing, and I've yet to experience any of those problems since installing Ubuntu. The information I can give may be very limited since I'm having to use my cell phone to figure out the solution to this. Any help would be greatly appreciated. Thanks in advance!

    Read the article

  • Ubuntu 12.10 desktop/interface not showing on VirtualBox VM after login screen

    - by Jake
    I'm having some trouble getting Ubuntu to work in a VirtualBox VM. I made a clean installation of Ubuntu 12.10 on a VM without any errors. I arrive at the login screen; as soon as I press Enter it does its little loading thingy and then the screen goes black, and this is all I get: http://i.imgur.com/zULUI.jpg. I can access the terminal and pretty much all the other features through it, but I would like to have the GUI properly working. I've been looking around the web at various fixes for similar problems, but can't seem to get it to work. I'm thinking this problem might have to do with the graphics? I'm running Windows 8 Pro as host, if that helps; there might be some compatibility issues with VirtualBox on W8... Thanks in advance!

    Read the article

  • My laptop hangs a lot

    - by Salahuddin
    My laptop has 1 GB of RAM, a 120 GB hard disk and a 1.68 GHz processor, and I have both Ubuntu and Windows 7 on the machine. For the last few days, my laptop, running Ubuntu 11.10, has been hanging a lot. A lot of the time, the touchpad won't work for unknown reasons, and I have to connect a USB mouse to be able to move the pointer on the screen. When browsing the web, the browser hangs a lot, and I have to restart the laptop to refresh it. I know that there is a hotkey to force Ubuntu to refresh when it hangs, but it doesn't work here. These problems happen only on Ubuntu, not on Windows 7. How do I make Ubuntu light and fast? Help me, please.

    Read the article

  • Good application for taking notes on 12.04?

    - by Ankit
    Is there a good note-taking application for Ubuntu, like Evernote for Windows and Mac users? Requirements for a good application:
    - A thumbnail/list preview for all the created notes.
    - Integration with an email address so that the notes can be viewed anywhere, anytime.
    - An easy way to input text, video, PDF, images etc.
    I have tried Tomboy and Basket Notes; they aren't that good. Edit: I am trying to install Nevernote from an external source, but its dependencies cannot be satisfied. Further trying to install libssl gives the following error:
    The following information may help to resolve the situation:
    The following packages have unmet dependencies:
     libssreflect-coq : Depends: coq-8.3pl3+3.12.1 but it is not installable
     libssreflect-ocaml : Depends: libcoq-ocaml-4zyg6 but it is not installable
     libssreflect-ocaml-dev : Depends: libcoq-ocaml-dev-4zyg6 but it is not installable
    E: Unable to correct problems, you have held broken packages.
    Is there anything that I might be doing wrong?

    Read the article

  • 14.04 Chinese Ibus Input - No Options

    - by RhZ
    I'm getting my new 14.04 rig going ;-) Pretty happy with it; everything seems to be working great. For Chinese input, however, I'm having a problem. I went through the typical steps: open Language Support in Settings, let it install some stuff, then add Chinese and choose IBus. Then, after logging in and out, I see the language icon in my system tray. However, when I go to add Chinese to IBus, it only lets me choose "Chinese", which isn't an input method. It should give me a bunch of choices like Pinyin, Bopomofo or whatever; I only use Pinyin, so I don't know the other names. I saw someone online had a little command which helped people with similar problems, but it did not work for me, even after a restart. So, does anyone have a solution? Edit: Here is what it looks like: just "Chinese" in the list, when there should be a bunch of input options like Pinyin, Bopomofo, and so on.

    Read the article

  • How To View Upcoming Weather, Sports Games, TV Shows, and More in Google Calendar

    - by Chris Hoffman
    Google Calendar isn’t just a tool to keep track of your own events. You can subscribe to a number of special calendars that automatically update with the latest weather, sports games, air times for your favorite TV shows, and more. This is the sort of thing that a paper calendar could never do, and what makes digital calendars like Google Calendar so useful. Add some automatically updating calendars and you’ll wonder how people ever used paper calendars.

    Read the article

  • How do you forcibly unmount a disk when you press the eject button on an optical drive?

    - by Michael Curran
    When upgrading my hardware, I also upgraded to Ubuntu 10.10. On my previous system (with 10.04 and earlier), when I ejected a disk from the optical drive, the subfolder in the /media directory was automatically removed. On my new 10.10 system, if I don't eject the disk using the "eject" command within the system, the disk remains mounted, even after a new disk is inserted. The new drive is a Blu-ray drive, but I haven't noticed any other problems from it. Normally this isn't a problem, but it makes installing applications that are spread over multiple CDs more difficult in many cases (i.e. Wine). Any advice?

    Read the article

  • Business Case for investing time developing Stubs and BizUnit Tests

    - by charlie.mott
    I was recently in a position where I had to justify why effort should be spent developing stubbed integration tests for BizTalk solutions. These tests are usually developed using the BizUnit framework. I assumed that most seasoned BizTalk developers would consider this best practice. Even though Microsoft suggests the use of BizUnit on MSDN, I've not found a single site listing the justifications for investing time writing stubs and BizUnit tests.

    Stubs

    Stubs should be developed to isolate your development team from external dependencies. This is described by Michael Stephenson here. Failing to do this can result in the following problems:
    - In contract-first scenarios, the external system interface will have been defined, but the interface may not have been set up or even developed yet for the BizTalk developers to work with.
    - By the time you open the target location to see the data BizTalk has sent, it may have been swept away.
    - If you are relying on the UI of the target system to see the data BizTalk has sent, what do you do if it fails to arrive? It may take time for the data to be processed, or it may be scheduled to be processed later.
    - Learning how to use the source\target systems and investigating where things go wrong in those systems will slow down the BizTalk development effort.
    - By the time the data is visible in a UI it may have undergone further transformations.
    - In larger development teams working together, do you all use the same source and target instances? How do you know which data was created by whose tests? How do you know which event log error messages are whose? Another developer may have "cleaned up" your data.
    - It is harder to write BizUnit tests that clean up the data\logs after each test run.
    - What if your B2B partners' source or target system cannot support the sort of testing you want to do? They may not even have a development or test instance that you can work with. Their single test instance may be used by the SIT\UAT teams.
    - There may be licensing costs for setting up instances of the external system.
    The stubs I like to use are generic stubs that can accept\return any message type. Usually I need to create one per protocol. They should be driven by BizUnit steps to: validate the data received; and select a response message (or error response). Once built, they can be re-used for many integration tests and from project to project.
    I'm not saying that developers should never test against a real instance. Every so often, you still need to connect to real developer or test instances of the source and target endpoints\services. The interface developers may ask you to send them some data to see if everything still works, or you might want some messages sent to BizTalk to get confidence that everything still works beyond BizTalk.

    Tests

    Automated "Stubbed Integration Tests" are usually built using the BizUnit framework. These facilitate testing of the entire integration process from source stub to target stub. They will ensure that all of the BizTalk components are configured together correctly to meet all the requirements. More fine-grained unit testing of individual BizTalk components is still encouraged, but BizUnit provides much the easiest way to test some component types (e.g. orchestrations). Using BizUnit with the Behaviour Driven Development approach described by Mike Stephenson delivers the following benefits (source: http://biztalkbddsample.codeplex.com – Video 1):
    - Requirements can be easily defined using Given/When/Then
    - Requirements are close to the code, so they are easier to manage as features and scenarios
    - Requirements are defined in domain language
    - The feature files can be used as part of the documentation
    - The documentation is accurate to the build of code and can be published with a release
    - The scenarios are an effective way to document the scenarios and are not excessive
    - The scenarios are maintained with the code
    - There's an abstraction between the intention and implementation of tests, making them easier to understand
    - The requirements drive the testing
    These same tests can also be used to drive load testing as described here. A rough sketch of the kind of generic stub described above follows this section.

    If you don't do this ...

    If you don't follow the above "Stubbed Integration Tests" approach, the developer will need to manually trigger the tests. This has the following risks:
    - Developers are unlikely to check all the scenarios and all the expected conditions each time.
    - After the developer leaves, these manual test steps may be lost. What test scenarios are there? What test messages did they use for each scenario?
    - There is no mechanism to prove adequate test coverage.
    A test team may attempt to automate integration test scenarios in a test environment by triggering tests from a source system UI. If this is a replacement for BizUnit tests, then this carries the following risks:
    - It moves the tests downstream, so problems will be found later in the process.
    - Testers may not check all the expected conditions within the BizTalk infrastructure, such as event logs, suspended messages, etc.
    - These automated tests may also get in the way of manual tests run on those environments.
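    The stubs themselves are .NET components driven by BizUnit, but the idea of a generic stub that accepts any message, validates what it receives and returns a selectable canned response is protocol-level and easy to picture. Below is a rough TypeScript/Node sketch of an HTTP flavour of that idea; it is purely illustrative and not part of BizUnit, and the port, scenario names and validation rule are made up.

        import * as http from "http";

        // One canned response per test scenario; the test harness picks which one is active.
        const cannedResponses: Record<string, { status: number; body: string }> = {
          "happy-path": { status: 200, body: "<Ack>OK</Ack>" },
          "error-case": { status: 500, body: "<Fault>boom</Fault>" },
        };
        let activeScenario = "happy-path";

        const server = http.createServer((req, res) => {
          let received = "";
          req.on("data", (chunk: Buffer) => { received += chunk.toString(); });
          req.on("end", () => {
            // "Validate the data received": here just a crude check that a payload arrived.
            if (received.length === 0) {
              res.writeHead(400);
              res.end("empty request");
              return;
            }
            console.log("stub received:", received);   // lets the test assert on it afterwards
            const canned = cannedResponses[activeScenario];
            res.writeHead(canned.status, { "Content-Type": "text/xml" });
            res.end(canned.body);
          });
        });

        // A test setup step would start the stub and choose the scenario before driving BizTalk.
        server.listen(8099, () => console.log("stub listening on 8099"));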

    Read the article
