Search Results

Search found 31968 results on 1279 pages for 'database training'.


  • Sign on Experience with Office 365

    - by Sahil Malik
    SharePoint 2010 Training: more information Office 365 offers two types of identities: · Microsoft Online Services cloud IDs (Cloud Identity): This is the default identity Microsoft provides you. It requires no additional setup: you sign up for Office 365 and you are provided a credential. You sign in using forms-based authentication, and the password policy is stored in the cloud with the Office 365 service. The advantage, obviously, is no additional setup headache. The disadvantage? Yet another password to remember, and no hope of authenticated single sign-on integration using this cloud identity with other services, at least in the current version. · Federated IDs (Federated Identity): In companies with on-premises Active Directory, users can sign into Office 365 services using their Active Directory credentials. The corporate Active Directory authenticates the users, and stores and controls the password policy. The advantage here is plenty of single sign-on possibilities and a better user experience. The downside? More ... Read full article ....

    Read the article

  • Stumbling Through: Making a case for the K2 Case Management Framework

    I have recently attended a three-day training session on K2's Case Management Framework (CMF), a free framework built on top of K2's blackpearl workflow product, and I have come away with several different impressions of the different aspects of the framework.  Before we get into the details, what is the Case Management Framework?  It is essentially a suite of tools that, when used together, solve many common workflow scenarios.  The tool has been developed over time by K2 consultants who realized they tend to solve the same problems over and over for various clients, so they attempted to package all of those common solutions into one framework.  Most of these common problems involve workflow processes that aren't necessarily direct and would tend to be difficult to model.  Such solutions could be achieved in blackpearl alone, but the workflows would be complex and difficult to follow and maintain over time.  CMF attempts to simplify such scenarios not so much by black-boxing the workflow processes, but by providing different points of entry to the processes, allowing them to be simpler and moving the complexity to a middle layer.  It is not a solution in and of itself; development is still required to tie the pieces together.
    CMF is under continuous development, which is both a plus and a minus: bugs are fixed quickly and features added regularly, but it may be difficult to know which versions are the most stable.  CMF is not an officially supported K2 product, which means you will not get technical support, but you will get access to the source code.
    The example given of a business process that would fit well into CMF is that of a file cabinet, where each folder in said file cabinet is a case that contains all of the data associated with one complaint/customer/incident/etc., and various users can access that case at any time and take one of a set of pre-determined actions on it.  When I was given that example, my first thought was that any workflow I have ever developed in the past could be made to fit this model; there must be more than just this model to help decide if CMF is the right solution.  As the training went on, we learned that one of the key features of CMF is SharePoint integration: each case gets a SharePoint site created for it, and there are a number of excellent web parts that can be used to design a portal for users to get at all the information on their cases.  While CMF does not require SharePoint, without it you will be missing out on a huge portion of the functionality that CMF offers.  My opinion is that without SharePoint integration, you may as well write your workflows and other components the old-fashioned way.
    When I heard that each case gets its own SharePoint site created for it, warning bells immediately went off in my head, as I felt that depending on the data load, a CMF-enabled solution could quickly overwhelm SharePoint with thousands of sites. So we have yet another deciding factor for CMF: just how many cases will your solution be creating?  While it is not necessary to use the site-per-case model, it is one of the more useful parts of the framework.  Without it, you are losing a big chunk of what CMF has to offer.  When it comes to developing on top of the Case Management Framework, it becomes a matter of configuring what makes up a case, what can be done to a case, where each action on a case should take the user, and then tying actions to case statuses.
    This last step is one that I immediately warmed up to, as just about every workflow I've designed in the past needed some sort of mapping table to set the status of a work item based on the action being taken; definitely one of those common solutions that it is good to see rolled up into a re-usable entity (and it gets a nice configuration UI to boot!).  This concept is a little different from traditional workflow design, in that you don't have to think of an end-to-end process around passing a case along a path; rather, you must envision the case as a central object with workflow threads branching off of it and doing their own thing with the case data.  Certainly some workflow threads can get rather complex, but the idea is that they RELATE to the case, they don't BECOME the case (though it is still possible with action->status mappings to prevent certain actions in certain cases, so it isn't always a wide-open free-for-all of actions on a case). I realize that this description of the Case Management Framework merely scratches the surface of what the product actually can do, and I don't think I've conclusively defined for what sort of business scenario you can make a case for the Case Management Framework.  What I do hope to have accomplished with this post is to raise awareness of CMF: there is a (free!) product out there that could potentially simplify a tangled workflow process and give (for free!) a very useful set of SharePoint web parts and a nice set of (free!) reports.  The best way to see if it will truly fit your needs is to give it a try. Did I mention it is FREE?  Er, ok, so it is free, but only obtainable at this time for K2 partners.

    Read the article

  • What Counts For A DBA: Foresight

    - by drsql
    Of all the valuable attributes of a DBA covered so far in this series, ranging from passion to humility to practicality, perhaps one of the most important attributes may turn out to be the most seemingly nebulous: foresight. According to the Free Dictionary, foresight is the "perception of the significance and nature of events before they have occurred". Foresight does not come naturally to most people, as the parent of any teenager will attest. No matter how clearly you see their problems coming, they won't listen, and have to fail before eventually (hopefully) learning for themselves. Having graduated from the school of hard knocks, the DBA, the naive teenager no longer, acquires the ability to foretell how events will unfold in response to certain actions or attitudes with the unerring accuracy of a doom-laden prophet. Like Simba in The Lion King, after a few blows to the head, we foretell that a sore head will be the inevitable consequence of a swing of Rafiki's stick, and we take evasive action. However, foresight is about more than simply learning when to duck. It's about taking the time to understand and prevent the habits that caused the stick to swing in the first place. And based on this definition, I often think there is a lot less foresight on display in my industry than there ought to be. Most DBAs reading this blog will spot a line such as the following in a piece of "working" code, understand immediately why it is less than optimum, and take evasive action. …WHERE CAST (columnName as int) = 1 However, the programmers who regularly write this sort of code clearly lack that foresight, and this and numerous other examples of similarly malodorous code prevail throughout our industry (and provide premium-grade fertilizer for the healthy growth of many a consultant's bank account). Sometimes, perhaps harried by impatient managers and painfully tight deadlines, everyone makes mistakes. Yes, I too occasionally write code that "works", but basically stinks. When the problems manifest, it is sometimes accompanied by a sense of grim recognition that somewhere in me existed the foresight to know that that approach would lead to this problem. However, in the headlong rush, warning signs got overlooked, and lessons learned previously, which could supply the foresight to the current project, were lost and not applied.   Of course, the problem often is a simple lack of skills, training and knowledge in the relevant technology and/or business space; programmers and DBAs forced to do their best in the face of inadequate training, or to apply their skills in areas where they lack experience. However, often the problem goes deeper than this; I detect in some DBAs and programmers a certain laziness of attitude.   They veer from one project to the next, going with "whatever works", unwilling or unable to take the time to understand where their actions are leading them. Of course, the whole "Agile" mindset is often interpreted to favor flexibility and rapid production over aiming to get things right the first time. The faster you try to travel in the dark, frequently changing direction, the more important it is to have someone who has the foresight to know at least roughly where you are heading. This is doubly true for the data tier which, no matter how you try to deny it, simply cannot be "redone" every month as you learn aspects of the world you are trying to model that, with a little bit of foresight, you would have seen coming.
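    As a concrete illustration of the predicate above, here is a minimal T-SQL sketch (the table, index and column names are hypothetical, not from the article) showing why the CAST version hurts and how a sargable rewrite avoids it:

        -- Hypothetical table and index, for illustration only.
        CREATE TABLE dbo.Orders
        (
            OrderID    int IDENTITY(1,1) PRIMARY KEY,
            StatusCode varchar(10) NOT NULL    -- stores values such as '1', '2', ...
        );
        CREATE INDEX IX_Orders_StatusCode ON dbo.Orders (StatusCode);

        -- Non-sargable: wrapping the column in CAST hides it from the index,
        -- so every row must be scanned and converted.
        SELECT OrderID
        FROM   dbo.Orders
        WHERE  CAST(StatusCode AS int) = 1;

        -- Sargable rewrite: compare the column as it is stored, and the index can be used.
        SELECT OrderID
        FROM   dbo.Orders
        WHERE  StatusCode = '1';

    The deeper fix, in the spirit of the article, is to apply that foresight earlier and store the value with the right data type, so the conversion is never needed in the first place.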
Sometimes, when as a DBA you can glance briefly at 200 lines of working SQL code and know instinctively why it will cause problems, foresight can feel like magic, but it isn't; it's more like muscle memory. It is acquired as the consequence of good experience, useful communication with those around you, and a willingness to learn continually, through continued education as well as from failure. Foresight can be deployed only by finding time to understand how the lessons learned from other DBAs, and other projects, can help steer the current project in the right direction.   C.S. Lewis once said "The future is something which everyone reaches at the rate of sixty minutes an hour, whatever he does, whoever he is." It cannot be avoided; the quality of what you build now is going to affect you, and others, at some point in the future. Take the time to acquire foresight; it is a love letter to your future self, to say you cared.

    Read the article

  • Heterogeneous data access with OWB: ODI EE Enterprise ETL

    - by Fekete Zoltán
    Following on from the previous two blog posts, the question arises: how can heterogeneous data sources be accessed with Oracle Warehouse Builder? Recommended reading: Oracle Warehouse Builder 11gR2: OWB ETL Using ODI Knowledge Modules. Of course, OWB has always been able to reach all kinds of ODBC-compatible and mainframe databases through Oracle Database Heterogeneous Services, via ODBC or by using Oracle Gateways. Oracle Database Gateways: MS SQL Server, Sybase, Teradata, Informix, ODBC, DRDA, APPC, WebSphere MQ, DB2, DB2/400. By purchasing the appropriate Application Adapters, OWB can connect, for example, to the following sources: SAP, Oracle E-Business Suite, Peoplesoft, Siebel, Oracle Customer Data Hub (CDH), Universal Customer Master (UCM), Product Information Management (PIM). Starting with OWB 11gR2, OWB can use the Oracle Data Integrator Knowledge Modules for heterogeneous data access, via JDBC and other heterogeneous access methods. Recommended reading: Oracle Warehouse Builder 11gR2: OWB ETL Using ODI Knowledge Modules. Download: Oracle Warehouse Builder. BTW, the OWB Java client software can be used on both Linux and Windows. The server side, of course, runs in the Oracle database, on the Solaris, Linux, HP-UX, AIX and Windows operating systems.

    Read the article

  • Oracle E-Business Supply Chain Suite Release 12.1.2: Latest & Greatest!

    - by [email protected]
    This week we hosted one of several planned orientation and training sessions for the ASR/ASM sales community.  The purpose of the session was to familiarize our contact center and marketing associates with the 'hotpoints' of the latest release and to provide a few 'snippets' for the scheduled 'call-down' to the installed base.  Oracle EBS Release 12.1.2 contains some of the most powerful supply chain applications technology available to the industrial, commercial and public sector communities.  They should all be taking advantage of this great capability to drive margins, control costs and achieve compliance.   In today's changing business landscape, organizations need competitive advantage, and our customers leveraging the upgrade tell us that R12 provides this capability.

    Read the article

  • Which is better: Storing/retrieving images on/from SQL Server or in a directory on the server

    - by Pankaj Upadhyay
    I am working on a project in ASP.NET MVC and need to work with images. There is an SQL database with a Product table. Every product in the table will have its own image. I have two ways to do this: 1) Save the image in a web directory and store the URL in the database. 2) Store the image in SQL itself in binary format and then retrieve it. Which is a better approach? Mind you, I have no idea how the second method works :-P. I will only learn it if there are merits to the second method.
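    For reference, a minimal T-SQL sketch of the two options (table and column names are hypothetical): option 1 keeps only a path/URL in the database, option 2 stores the bytes themselves.

        -- Option 1: image lives in a web directory; the database stores only its URL.
        CREATE TABLE dbo.Product
        (
            ProductID int IDENTITY(1,1) PRIMARY KEY,
            Name      nvarchar(100) NOT NULL,
            ImageUrl  nvarchar(260) NULL          -- e.g. '/images/products/123.jpg'
        );

        -- Option 2: image bytes live in the database and are streamed out by the application.
        CREATE TABLE dbo.ProductWithImage
        (
            ProductID int IDENTITY(1,1) PRIMARY KEY,
            Name      nvarchar(100) NOT NULL,
            ImageData varbinary(max) NULL         -- raw image bytes
        );

    SQL Server 2008 and later also offer FILESTREAM storage for varbinary(max) columns, which keeps the bytes on the file system while leaving them transactionally consistent with the row; that is sometimes a useful middle ground.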

    Read the article

  • Consumer Electronics Show (CES): CRM for High Technology Firms

    - by charles.knapp
    The Consumer Electronics Show, opening Thursday, showcases product innovations that stem from best practices in design, manufacturing, and distribution. Oracle and IBM invite you to learn best practices from peers, as well as why it matters to use CRM tailored for high technology firms -- offered only by Oracle. On Wednesday, January 5, 1-7 pm at the Bellagio Hotel Las Vegas, learn from peers at IBM, VTech, Plantronics, Cisco, Symantec, and Oracle about how to improve: channel sales, marketing, and operations management (maximize new product introductions (NPI), sales, forecasts, training, channel promotions, and settlement); winning the deal (determine the right price for the right deal for the "perfect quote," capture the order, and manage orders); and collaborative and rapid supply chain planning (improve agility, inventory turns, and profits). Please join us for the Oracle/IBM CES High Technology Summit and make useful connections with your peers at the evening networking reception. Register now for this FREE event.

    Read the article

  • PASS Summit 2011 – Part III

    - by Tara Kizer
    Well, we're about a month past PASS Summit 2011, and yet I haven't finished blogging my notes! Between work and home life, I haven't been able to come up for air in a bit.  Now on to my notes… On Thursday of the PASS Summit 2011, I attended Klaus Aschenbrenner's (blog|twitter) "Advanced SQL Server 2008 Troubleshooting", Joe Webb's (blog|twitter) "SQL Server Locking & Blocking Made Simple", Kalen Delaney's (blog|twitter) "What Happened? Exploring the Plan Cache", and Paul Randal's (blog|twitter) "More DBA Mythbusters".  I think my head grew two times in size from the Thursday sessions.  Just WOW! I took a ton of notes in Klaus' session.  He took a deep dive into how to troubleshoot performance problems.  Here is how he goes about solving a performance problem: start by checking the wait stats DMV, then system health, then memory issues, then I/O issues.  I normally start with blocking and then hit the wait stats.  Here's the wait stat query (Paul Randal's) that I use when working on a performance problem.  He highlighted a few waits to be aware of, such as WRITELOG (indicates an I/O subsystem problem), SOS_SCHEDULER_YIELD (indicates a CPU problem), and PAGEIOLATCH_XX (indicates an I/O subsystem problem or a buffer pool problem).  Regarding memory issues, Klaus recommended that as a bare minimum, one should set "max server memory (MB)" in sp_configure so that 2GB or 10% is reserved for the OS (whichever comes first).  This is just a starting point though! Regarding I/O issues, Klaus talked about disk partition alignment, which can improve SQL I/O performance by up to 100%.  You should use 64KB for the NTFS cluster size, and it's automatic in Windows 2008 R2. Joe's locking and blocking presentation was a good session to really clear up the fog in my mind about locking.  One takeaway that I had no idea could be done was that you can set a timeout in T-SQL code via LOCK_TIMEOUT.  If you do this via the application, you should trap error 1222. Kalen's session went into execution plans.  The minimum size of a plan is 24k.  This adds up fast, especially if you have a lot of plans that don't get reused much.  You can use sys.dm_exec_cached_plans to check how often a plan is being reused by checking the usecounts column.  She said that we can use DBCC FLUSHPROCINDB to clear out the stored procedure cache for a specific database.  I didn't know we had this available, so this was great to hear.  This will be less intrusive than running DBCC FREEPROCCACHE when an emergency comes up. Kalen said one should enable "optimize for ad hoc workloads" if you have an ad hoc load.  This stores only a 300-byte stub of the first plan, and if it gets run again, it'll store the whole thing.  This helps with plan cache bloat.  I have a lot of systems that use prepared statements, and Kalen says we simulate those calls by using sp_executesql.  Cool! Paul did a series of posts last year to debunk various myths and misconceptions around SQL Server.  He continues to debunk things via "DBA Mythbusters".  You can get a PDF of a bunch of these here.  One of the myths he went over is the number of tempdb data files that you should have.  Back in 2000, the recommendation was to have as many tempdb data files as there are CPU cores on your server.  This no longer holds true due to the numerous cores we have on our servers.  Paul says you should start out with 1/4 to 1/2 the number of cores and work your way up from there.  BUT!
    Paul likes what Bob Ward (twitter) says on this topic: for 8 or fewer cores, set the number of files equal to the number of cores; for more than 8 cores, start with 8 files and increase in blocks of 4. One common myth out there is to set your MAXDOP to 1 for an OLTP workload with high CXPACKET waits.  Instead of that, dig deeper first.  Look for missing indexes, out-of-date statistics, increase the "cost threshold for parallelism" setting, and perhaps set MAXDOP at the query level.  Paul stressed that you should not plan a backup strategy but instead plan a restore strategy.  What are your recoverability requirements?  Once you know that, now plan out your backups. As Paul always does, he talked about DBCC CHECKDB.  He said how fabulous it is.  I didn't want to interrupt the presentation, so after his session had ended, I asked Paul about the need to run DBCC CHECKDB on your mirror systems.  You could have data corruption occur at the mirror and not at the principal server.  If you aren't checking for data corruption on your mirror systems, you could be failing over to a corrupt database in the case of a disaster or even a planned failover.  You can't run DBCC CHECKDB against the mirrored database, but you can run it against a snapshot of the mirrored database.
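    For convenience, here is a minimal T-SQL sketch of a few of the settings and techniques mentioned above (the database name, file path and memory value are examples only, not recommendations):

        -- Cap SQL Server memory so the OS keeps some for itself (value is an example).
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)', 14336;
        RECONFIGURE;

        -- Store only a small stub for single-use ad hoc plans, to limit plan cache bloat.
        EXEC sp_configure 'optimize for ad hoc workloads', 1;
        RECONFIGURE;

        -- Give up on a lock after 5 seconds instead of blocking indefinitely;
        -- error 1222 is raised when the timeout expires, so trap it in the application.
        SET LOCK_TIMEOUT 5000;

        -- Check a mirror for corruption by running CHECKDB against a snapshot of it
        -- (database and file names are hypothetical).
        CREATE DATABASE MirrorDB_Snapshot ON
            (NAME = MirrorDB_Data, FILENAME = 'D:\Snapshots\MirrorDB_Data.ss')
        AS SNAPSHOT OF MirrorDB;
        DBCC CHECKDB (MirrorDB_Snapshot) WITH NO_INFOMSGS;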

    Read the article

  • Is WCF suitable for writing an application which is shared among applications?

    - by RPK
    I have developed and deployed a few ASP.NET applications. Sometimes I want to stop users from inserting or updating a record when: maintenance is going on, or operations are stopped due to a payment delay. In one of my recent applications I have implemented this feature by first checking the database operations for a locked status. If either of the above conditions is met, database operations like insert and update are not carried out. I now need this feature in all the old applications and the future applications I build. I want to know whether WCF is suitable in this scenario, as I want to share methods or an independent locking application among various other applications. Is WCF appropriate for this type of scenario?

    Read the article

  • A clear advantage with Oracle Enablement 2.0!

    - by A&C Redaktion
    Oracle Enablement 2.0 contains training offerings that Oracle partners can use efficiently, tailored to their particular work situation, on site or online. All of these offerings help Oracle partners develop and expand their sales strength and deepen their implementation skills. As an Oracle partner, stay on the ball and check regularly how you can build up the know-how your company needs for the OPN Specialization and the associated assessments. The Oracle Country Enablement team helps Oracle partners with specialization training and individual advice. You will find up-to-date information on training and specialization on our Enablement blog, which Frank Lauer and Corry Weick briefly introduce to you in the video.

    Read the article

  • SharePoint 2010 Hosting :: Error Message - Your search cannot be completed because this site is not assigned to an indexer

    - by mbridge
    Error when trying to access search in a SharePoint site: "Your search cannot be completed because this site is not assigned to an indexer. Contact your administrator for more information." Solving the problem: 1. Go to SharePoint Central Administration > Application Management > Content Databases (underneath SharePoint Web Application Management). 2. Select the correct SharePoint web application – click on the name of the content database; this will open the "Manage Content Database Settings" page. 3. Make sure that the Search Server is set on the "Manage Content Database Settings" page. Hope it helps!!

    Read the article

  • June IOUG events

    - by Mandy Ho
    Independent Oracle User Group (IOUG) Regional Events:
    June 11-12, 2012 – Broomfield, CO: 2-Day Seminar, "High Performance PL/SQL & Oracle Database 11g New Features". Steven Feuerstein, generally considered the world's leading PL/SQL expert, will be presenting his all-new, 2-day "Higher Performance PL/SQL and Oracle 11g PL/SQL New Features" seminar on June 11 & 12 at Level 3 Communications in Broomfield, Colorado.  This will be Steven's first Denver seminar in almost 4 years.  Who knows when he will offer another? http://www.rmoug.org/
    June 14, 2012 – Ottawa, Ontario: Pythian's Gwen Shapira puts on 3 great presentations focused on NoSQL, making OLTP run fast, and Big Data. http://www.oug-ottawa.org/pls/htmldb/f?p=327:27:1317735724699447::NO
    June 21, 2012 – Calgary, Alberta: Big Data and Extreme Analytics Summit. http://coug.ab.ca/
    June 22, 2012 – Westborough, MA: 10 Things You Probably Did Not Know? With Tom Kyte. PL/SQL turns 23 years old this year. It was first introduced in 1988 with Oracle6 Database. This session looks at five technical things about PL/SQL you probably did not know: under-the-covers features that make PL/SQL quite simply the most efficient language with which to process data in the database. http://noug.com/
    June 28/29, 2012 – Plano, Texas: Jonathan Lewis Oracle Performance Seminars. The DOUG (DALLAS ORACLE USERS GROUP) has invited SpeakTech to return to Dallas, and they're bringing Jonathan Lewis! Topics are Beating the Oracle Optimizer – June 28, 2012, and Trouble Shooting & Tuning – June 29, 2012. http://www.eventbrite.com/event/3082448687

    Read the article

  • "Oracle Certified Expert, Java Platform, Enterprise Edition 6 Java Persistence API Developer" Preparation

    - by Matt
    I have been working with Hibernate for a few years now, and I want to solidify and demonstrate my knowledge by taking the Oracle JPA certification, also known as: "Oracle Certified Expert, Java Platform, Enterprise Edition 6 Java Persistence API Developer (CX-310-094)". There is a training course provided by Oracle, "Building Database Driven Applications with JPA (SL-370-EE6)", but this costs $1800 and I think it would be overkill for my needs. Ideally, I would like a self-study guide that will cover everything in the exam. I have looked for books and these seem like possibilities: Pro JPA 2: Mastering the Java Persistence API (Expert's Voice in Java Technology) and Beginning Java EE 6 with GlassFish 3, 2nd Edition (Expert's Voice in Java Technology). But these aren't checklist-type study guides as far as I am aware. I found the official SCJP study guide very useful, but I think the equivalent text for the JPA exam isn't out yet. If anyone has taken this exam, I would be grateful to hear how you prepared for it. Thanks!

    Read the article

  • Phoenix Silverlight UserGroup tonight

    - by Dave Campbell
    Yes, it really is the first Wednesday of the month! And yes, we are having our normally scheduled meeting tonight. Everyone should be back from any Memorial Day festivities, dragging your collective drawers at work and looking for something interesting. Voila!... the Silverlight User Group meeting from 6PM to 8PM at Interface Technical Training (NW corner of Central & Thomas) -- check PhoenixSilverlight.net for a map. Come out for pizza and networking at 6, then hang around because Les Brown of Sogeti is returning to speak on the new features in .NET 4.0 as a project-based demo. I've got some fun give-aways, so come on out... it'll be too hot tonight to be anywhere else :)

    Read the article

  • Adventures in Scrum: Lesson 1 – The failed Sprint

    - by Martin Hinshelwood
    I recently had a conversation with a product owner who wanted to have the Scrum team broken up into smaller units so that less time was wasted on the Scrum ceremonies! Their complaint was around the need in Scrum to have the entire "Team" (7±2) involved in the sizing of the work during the "Sprint Planning Meeting".  The standard flippant answer of all Scrum professionals, "Well that's not Scrum", does not get you any brownie points in these situations. The response could be "Well we are not doing Scrum then", which in turn leads to "We are doing Scrum… but we have split the Scrum team into units of 2/3 so that they can concentrate on a specific area of work". While this may work, it is not Scrum and should not be called so… it is just a form of Agile. Don't get me wrong at this stage, there is nothing wrong with Agile, just don't call it Scrum. The reason that the product owner wants to do this is that, in effect, through a number of miscommunications and failings in our implementation of Scrum, there was NO unit of potentially shippable software at the end of the first sprint. It does not matter to them that most Scrum teams will fail the first Sprint, even those that are high performing teams. Remember, it is the product owner's money! We should NOT break up Scrum teams into smaller units for the purpose of having fewer people tied up in the Scrum ceremonies. The amount of backlog the Team selects is solely up to the Team… Only the Team can assess what it can accomplish over the upcoming Sprint. - Scrum Guide, Scrum.org The entire team must accept the work, and in order to understand what they can accept they must be free to size it as a team. This both encourages common understanding and increases visibility on why team members think a task is of a particular size. This has the benefit of increasing the knowledge of the entire team in the problem domain. A new Team often first realizes that it will either sink or swim as a Team, not individually, in this meeting. The Team realizes that it must rely on itself. As it realizes this, it starts to self-organize to take on the characteristics and behaviour of a real Team. - Scrum Guide, Scrum.org This paragraph goes to the why of having the whole team at the meeting; the goal of Scrum is to produce a unit of potentially shippable software at the end of every Sprint. In order to achieve this we need high performing teams, and this is what Scrum as a framework has been optimised to produce. I think that our product owner is understandably upset over losing two weeks' work and is losing sight of the end goal of Scrum in the failures of the moment. As the man spending the money, I completely understand his perspective, and I think that we should not have started Scrum on an internal project, but should have selected a customer that is open to the ideas and complications of Scrum. So, what should we have NOT done on our first Scrum project: Should not have had 3 interns as the only on-site resource – This led to bad practices, as the experienced guys were not there helping and correcting as they usually would. Should not have had the only experienced guys offsite – With both the experienced technical guys in completely different time zones it was difficult to get time for questions. Helping the guys on site was just plain impossible. Should not have used a part-time ScrumMaster – Although the ScrumMaster attended all of the ceremonies, because they are only in 2 full days of the week it makes it difficult for the team to raise impediments as they go.
    Should not have used a proxy product owner – This was probably the worst decision that was made, mainly because the proxy product owner did not have the same vision as the product owner. While Scrum does not explicitly reject the idea of a proxy product owner, I do not think it works very well in practice. The "single wringable neck" needs to contain both the Money and the Vision, as well as attending the required meetings. I will be bringing all of these things up at the Sprint Retrospective and we will learn from our mistakes and move on. Do, Inspect, then Adapt…   Technorati Tags: Scrum,Sprint Planning,Sprint Retrospective,Scrum.org,Scrum Guide,Scrum Ceremonies,ScrumMaster,Product Owner Need Help? Professional Scrum Developer Training SSW has six Professional Scrum Developer Trainers who specialise in training your developers in implementing Scrum with Microsoft's Visual Studio ALM tools.

    Read the article

  • 3rd Party Tools: dbForge Studio for SQL Server

    - by Greg Low
    I've been taking a look at some of the 3rd party tools for SQL Server. Today, I looked at dbForge Studio for SQL Server from the team at DevArt. Installation was smooth. I did find it odd that it defaults to SQL authentication, not to Windows, but either works fine. I like the way they have followed the SQL Server Management Studio visual layout. That will make the product familiar to existing SQL Server Management Studio users. I was keen to see what the database diagram tools are like. I found that the layouts generated were quite good, and certainly superior to the built-in SQL Server ones in SSMS. I didn't find any easy way to just add all tables to the diagram though. (That might just be me.) One thing I did like was that it doesn't get confused when you have role-playing dimensions. Multiple foreign key relationships between two tables display sensibly, unlike with the standard SQL Server version. It was pleasing to see a printing option in the diagramming tool. I found the database comparison tool worked quite well. There are a few UI things that surprised me (like when you add a new connection to a database, it doesn't select the one you just added by default), but generally it just worked as advertised, and the code that was generated looked OK. I used the SQL query editor and found the code formatting to be quite fast, and while I didn't mind the style that it used by default, it wasn't obvious to me how to change the format. In Tools/Options I found things that talked about Profiles, but I wasn't sure if that's what I needed. The help file pointed me in the right direction and I created a new profile. It's a bit odd that when you create a new profile, it doesn't put you straight into editing the profile. At first I didn't know what I'd done. But as soon as I chose to edit it, I found that a very good range of options were available. When entering SQL code, the code completion options are quick, but even though they are quite complete, one of the real challenges is in making them useful. In the screenshot I captured, the options shown were correct, but none were actually helpful. The Query Profiler seemed to work quite well. I keep wondering when the version supplied with SQL Server will ever have options like finding the most expensive operators, etc. Now that it's deprecated, perhaps never, but it's great to see third party options like this one, and like SQL Sentry's Plan Explorer, having this functionality. I didn't do much with the reporting options as I use SQL Server Reporting Services. Overall, I was quite impressed with this product, and given they have a free trial available, I think it's worth your time taking a look at it.
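    As an aside, the kind of analysis lamented above can be approximated without any third-party tool by looking at the most expensive queries (rather than the most expensive operators within a plan) via the DMVs; a minimal sketch:

        -- Top cached queries by total CPU time (total_worker_time is in microseconds).
        SELECT TOP (10)
               qs.execution_count,
               qs.total_worker_time,
               qs.total_elapsed_time,
               st.text AS batch_text
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.total_worker_time DESC;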

    Read the article

  • cannot open ubuntu software center

    - by success
    I deleted some unnecessary icon themes and now my application icons are changed. I also cannot open Ubuntu Software Center... the following message is displayed:

        success@user-pc:~$ software-center
        2012-09-12 22:24:52,048 - softwarecenter.ui.gtk3.app - INFO - setting up proxy 'None'
        2012-09-12 22:24:52,055 - softwarecenter.db.database - INFO - open() database: path=None use_axi=True use_agent=True
        Traceback (most recent call last):
          File "/usr/bin/software-center", line 142, in <module>
            app = SoftwareCenterAppGtk3(datadir, xapian_base_path, options, args)
          File "/usr/share/software-center/softwarecenter/ui/gtk3/app.py", line 387, in __init__
            self.datadir)
          File "/usr/share/software-center/softwarecenter/ui/gtk3/panes/historypane.py", line 78, in __init__
            self._get_emblems(self.icons)
          File "/usr/share/software-center/softwarecenter/ui/gtk3/panes/historypane.py", line 192, in _get_emblems
            pb = icons.load_icon(emblem, self.ICON_SIZE, 0)
          File "/usr/lib/python2.7/dist-packages/gi/types.py", line 43, in function
            return info.invoke(*args, **kwargs)
        gi._glib.GError: Icon 'package-install' not present in theme

    I also tried the following command to change the icon, but it didn't work:

        gksu gedit /usr/share/applications/ubuntu-software-center.desktop

    Read the article

  • Announcement: Oracle Solaris 11.1

    - by uwes
    On October 3rd at Oracle OpenWorld, John Fowler announced Oracle Solaris 11.1. This first update to Oracle Solaris 11 increases uptime for the Oracle Database: 8x faster database shutdown and start-up, help for DBAs in finding and resolving I/O issues (increasing performance), and 1.2x Oracle RAC throughput. Oracle Solaris 11.1 drives up network utilization by extending network virtualization to include Edge Virtual Bridging and Data Center Bridging, which help manage network bandwidth for high-priority services and applications. Learn more and share these valuable tools with your customers to encourage them to deploy Oracle Solaris 11.1: read the press release here; Oracle Solaris 11.1 Data Sheet (PDF); What's New in Solaris 11.1; Oracle Solaris 11.1 FAQs; and join the online web event Oracle Solaris 11 Innovations for your Data Center on November 7, 2012.

    Read the article

  • data maintenance/migrations in image-based systems

    - by User
    Web applications usually have a database. The code and the database work hand in hand. Therefore frameworks like Ruby on Rails and Django create migration files. Sure, there are also servers written in Self or Smalltalk or other image-based systems that face the same problem: code is not written on the server but in the programmer's separate image. How do these systems deal with a changing schema and changing classes/prototypes? Which way do the migrations go? Example: what is the process of a new attribute going from the programmer's idea to the server code and all objects? I found chapter 8 of the Gemstone/S manual, but it does not really talk about the process of shipping code to the server.
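    For contrast with the image-based approach being asked about, a file-based migration of the Rails/Django kind typically boils down to a small versioned script applied to the database; a minimal T-SQL sketch (object names are hypothetical):

        -- Migration 0004_add_nickname: add the new attribute and record that this
        -- version has been applied, so every environment converges on the same schema.
        BEGIN TRANSACTION;

        ALTER TABLE dbo.Person ADD Nickname nvarchar(50) NULL;

        INSERT INTO dbo.SchemaMigrations (Version, AppliedAt)
        VALUES ('0004_add_nickname', GETDATE());

        COMMIT TRANSACTION;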

    Read the article

  • SUNsetting of legacy support apps to My Oracle Support

    - by chris.warticki
    Prepare for the upcoming retirement of Member Support Center, SunSolve and others.  December 10-12 will be the migration weekend to My Oracle Support.  The number one call to action to ensure continuous support is to register for My Oracle Support today.  There are still many opportunities to attend one of the remaining live sessions that the Customer Support Education team is leading.  Please join the discussion on the Support Training Community or Using My Oracle Support Community.
    Register for any of the 80+ Global Sessions for Customers
    Welcome—SUN Customers and Partners
    Transition to My Oracle Support
    FAQ for Legacy Sun Customers
    Escalation Instructions
    Network of Oracle Resources
    -Chris Warticki, twittering @cwarticki. Join one of the Twibes - http://twibes.com/MyOracleSupport or http://twibes.com/OracleSupport

    Read the article

  • How to implement Cache in web apps?

    - by Jhonnytunes
    This is really two questions. I'm doing a project for the university for storing baseball players' statistics, and from the baseball data I have to calculate the score by year for the player who is being displayed. The background is: let's say 10,000 users hit the player "Alex Rodriguez"; the application has to calculate the A-Rod stats by year 10,000 times instead of just reading them from somewhere they are temporarily saved. Here I go: What is the best method for caching this type of data? Do I have to use the same database, with some temporary values stored in it, or create a web service for that? What reading about web caching do you recommend?
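    One common approach, sketched below in T-SQL, is to keep a pre-computed summary table in the same database and refresh it whenever a player's raw stats change, so repeated page views read the cached rows instead of re-aggregating (all table, column and formula details are hypothetical):

        -- Hypothetical summary table: one cached row per player per season.
        CREATE TABLE dbo.PlayerYearStats
        (
            PlayerID   int           NOT NULL,
            SeasonYear smallint      NOT NULL,
            Score      decimal(10,2) NOT NULL,
            ComputedAt datetime      NOT NULL DEFAULT GETDATE(),
            CONSTRAINT PK_PlayerYearStats PRIMARY KEY (PlayerID, SeasonYear)
        );
        GO

        -- Refresh the cache for one player after his raw stats change.
        DECLARE @PlayerID int = 42;    -- example value

        DELETE FROM dbo.PlayerYearStats WHERE PlayerID = @PlayerID;

        INSERT INTO dbo.PlayerYearStats (PlayerID, SeasonYear, Score)
        SELECT PlayerID,
               SeasonYear,
               SUM(Hits + 2 * HomeRuns) AS Score   -- placeholder formula, not the real one
        FROM   dbo.BattingStats                    -- hypothetical raw-stats table
        WHERE  PlayerID = @PlayerID
        GROUP BY PlayerID, SeasonYear;

    An HTTP output cache or an in-memory cache in the web tier can sit on top of this as well; the database-side summary just guarantees the expensive aggregation is done once, not once per visitor.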

    Read the article

  • Certify August Updates

    - by Sadia2
    We have added some release and platform certifications to MOS Certify.
    Applications: Oracle Demantra Demand Management 7.3.1.5, Oracle Demantra Predictive Trade Planning 7.3.1.5, Oracle Demantra Sales and Operations Planning 7.3.1.5
    Database: Oracle Database Client 12.1.0.1.0, 11.2.0.4.0, Oracle Clusterware 11.2.0.4.0, Oracle Database 11.2.0.4.0, Oracle Real Application Clusters 11.2.0.4.0
    E-Business Suite: Oracle E-Business Suite 12.1.3, Oracle E-Business Suite 12.1.2, Oracle E-Business Suite 12.1.1, Oracle E-Business Suite 12.0.6, Oracle E-Business Suite 11.5.10.2
    Edge Applications: Oracle Transportation Management 6.3.2
    Enterprise Manager: Oracle Application Management Pack for Oracle E-Business Suite 12.1.0.1.0
    Fusion Middleware: Discoverer Administrator 11.1.1.6.0, Discoverer Desktop 11.1.1.6.0, Forms Builder 11.1.1.6.0, Oracle Application Development Framework 11.1.1.6.0, Oracle Application Development Runtime 11.1.1.6.0, Oracle Business Intelligence Publisher 11.1.1.6.0, Oracle Directory Services Manager 11.1.1.6.0, Oracle Forms 11.1.1.6.0, Oracle GoldenGate 11.1.1.1.0, 11.1.1.1.2, 11.1.1.1.1, Oracle GoldenGate Application Adapters 11.1.1.1.1, Oracle Identity and Access Management 11.1.2.0.0, 11.1.2.1.0, Oracle Identity Federation 11.1.1.6.0, Oracle Real-Time Decision Load Generator 11.1.1.7.0, Oracle Real-Time Decision Studio 11.1.1.7.0, Oracle Real-Time Decisions 11.1.1.6.0, Oracle Reports 11.1.1.6.0, Oracle Segmentation Server 11.1.1.6.0, Oracle Virtual Directory 11.1.1.6.0, Oracle Web Cache 11.1.1.6.0, Oracle WebCenter Content Imaging 11.1.1.8.0, Oracle WebCenter Content Inbound Refinery Server 11.1.1.8.0, Oracle WebCenter Content Records 11.1.1.8.0, Oracle WebCenter Content Rights 11.1.1.8.0, Oracle WebCenter Content UI 11.1.1.8.0, Oracle WebCenter Enterprise Capture 11.1.1.8.0, Oracle WebCenter Portal 11.1.1.8.0, Oracle WebCenter Sites 11.1.1.8.0, Oracle WebCenter Sites: CIP for EMC Documentum 11.1.1.8.0, Oracle WebCenter Sites: CIP for File Systems and MS SharePoint 11.1.1.8.0, Oracle WebCenter Sites: Community-Gadgets 11.1.1.8.0, Oracle WebCenter Sites: Explorer 11.1.1.8.0, Oracle WebCenter Universal Content Management 11.1.1.8.0, Reports Builder 11.1.1.6.0, Oracle WebCenter Content Records 11.1.1.8.0, Oracle WebCenter Content Rights 11.1.1.8.0, Oracle WebCenter Content UI 11.1.1.8.0, Oracle WebCenter Sites: Developer Tools 11.1.1.8.0
    FSGBU Insurance Group: Oracle Health Insurance Claims 2.13.3.0.0, 2.13.2.0.0, 2.13.1.0.0
    JD Edwards EnterpriseOne: JD Edwards EnterpriseOne Tools 9.1.3.0, 9.1.2.0, 9.1.0.0
    JD Edwards World: JD Edwards World Service Enablement A93SE, A931SE
    PeopleSoft: PeopleSoft PeopleTools 8.52
    Siebel Enterprise: Siebel Application Server 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel CRM Desktop Client 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Database Server 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel HI Web Client 8.2.2.2.0, 8.1.1.9.0, Siebel Gateway Server 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Outlook Add-in Client 8.2.2.2.0, Siebel Remote Client 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Tools Client 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Web Server Extension 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0

    Read the article

  • Strengthening code with possibly useless exception handling

    - by rdurand
    Is it a good practice to implement useless exception handling, just in case another part of the code is not coded correctly?
    Basic example. A simple one, so I don't lose everybody :). Let's say I'm writing an app that will display a person's information (name, address, etc.), the data being extracted from a database. Let's say I'm the one coding the UI part, and someone else is writing the DB query code. Now imagine that the specifications of your app say that if the person's information is incomplete (let's say the name is missing in the database), the person coding the query should handle this by returning "NA" for the missing field. What if the query is poorly coded and doesn't handle this case? What if the guy who wrote the query hands you an incomplete result, and when you try to display the information, everything crashes, because your code isn't prepared to display empty values? This example is very basic. I believe most of you will say "it's not your problem, you're not responsible for this crash". But it's still your part of the code which is crashing.
    Another example. Let's say now I'm the one writing the query. The specifications don't say the same as above, but that the guy writing the "insert" query should make sure all the fields are complete when adding a person to the database, to avoid inserting incomplete information. Should I protect my "select" query to make sure I give the UI guy complete information?
    The questions. What if the specifications don't explicitly say "this guy is the one in charge of handling this situation"? What if a third person implements another query (similar to the first one, but on another DB) and uses your UI code to display it, but doesn't handle this case in his code? Should I do what's necessary to prevent a possible crash, even if I'm not the one supposed to handle the bad case? I'm not looking for an answer like "(s)he's the one responsible for the crash", as I'm not solving a conflict here. I'd like to know: should I protect my code against situations it's not my responsibility to handle? Here, a simple "if empty do something" would suffice. In general, this question tackles redundant exception handling. I'm asking it because when I work alone on a project, I may code similar exception handling 2-3 times in successive functions, "just in case" I did something wrong and let a bad case come through.
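    For what it's worth, the defensive "return NA for missing fields" behaviour described above is cheap to build into the select query itself; a minimal T-SQL sketch (table and column names are hypothetical):

        -- Return 'NA' whenever a field is NULL or blank, so the UI never sees an empty value.
        DECLARE @PersonID int = 1;    -- example value

        SELECT  PersonID,
                COALESCE(NULLIF(LTRIM(RTRIM(Name)), ''), 'NA')    AS Name,
                COALESCE(NULLIF(LTRIM(RTRIM(Address)), ''), 'NA') AS Address
        FROM    dbo.Person
        WHERE   PersonID = @PersonID;

    The UI side can still keep its own "if empty, show a placeholder" check; the two guards are redundant only until one of them is forgotten.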

    Read the article

  • What programming language and framework has best support for agile web development?

    - by Jonas
    If I would like to quickly set up a modern website, which programming language + framework has the best support for this? E.g. short and easy-to-understand code for a beginner, and a framework with support for modern features. Disregard my current knowledge; I'm more interested in the capacity of web programming languages and frameworks. Some requirements: RESTful URIs: http://example.com/category/id/page-title, similar to the URLs here on Programmers. ORM: a framework that has good database support and provides an ORM, or maybe a NoSQL database. Good support for RESTful web services. Good support for testing and unit testing, to make sure the site is working as planned. Preferably a site that is ready to scale with an increasing number of users.

    Read the article
