Search Results

Search found 67705 results on 2709 pages for 'time management'.


  • Selling Visual Studio ALM

    - by Tarun Arora
    Introduction

    As a consultant I have been selling Application Lifecycle Management services using Visual Studio and Team Foundation Server. I have been contacted many times by friends working in organizations telling me that the ALM processes in their company were benchmarked when dinosaurs walked the earth. Most of these individuals already know the great features Microsoft ALM tools offer and are keen to start a conversation with the CIO, but don't know exactly where to start.

    How you engage in your first conversation is very important. If you open with 'There is this great tooling from Microsoft which offers amazing features to boost developer productivity…', from experience I can tell you the reply from your CIO will be 'I already know! Our existing landscape has a combination of bleeding-edge open source and cutting-edge licensed tools which already cover these features quite well; moreover, Microsoft products have a high licensing cost associated with them.' You will always find it harder to sell by feature. The trick is to highlight the gaps in the existing processes and tools, and then highlight the impact of these gaps on the overall development process. By then you will have captured enough attention to show how the ALM tooling offered by Microsoft not only fills those gaps but offers great value-adds that take their development practices to the next level.

    Rangers ALM Assessment Guide

    Image 1 – Welcome! First look at the Rangers ALM assessment guide

    Most organizations already have some processes in place to cover aspects of ALM. How do you go about proving that there isn't enough coverage in place? This is where the Visual Studio ALM Rangers ALM Assessment guide can help. The ALM assessment guide is really a tool that helps you gather information about development practices and processes within a customer's environment. Several questionnaires are used to identify the current state of individual development lifecycle areas and to decide on a desired state for those processes. It also presents guidance and roll-up summaries to help with recommendations moving forward. The ALM Rangers assessment guide can be downloaded from here.

    Image 2 – ALM Assessment guide divided into different functions of the SDLC

    The assessment guide is divided into the different functions of the Software Development Lifecycle, which gives you the ability to assess how mature the company is in each area of the SDLC: Architecture & Design; Requirement Engineering & UX; Development; Software Configuration Management; Governance; Deployment & Operations; Testing & Quality Assurance; Project Planning & Management.

    Each section has a set of questions; fill in the assessment by selecting "Never/Sometimes/Always" from the Answer column in the question sheets. Each answer carries a weighting toward the overall score. Each question has a link next to it; clicking the link takes you to the Reference sheet, which gives you more detail about the question, along with why you need to ask it, other ways to phrase the question, and what to expect as an answer from the customer. The trick is to engage the customer in a discussion. You need to probe a lot, listen to the customer, and have a discussion with several team members, preferably without management present, to ensure that you receive candid feedback.
    This reminds me of a funny incident: during an ALM review a customer told me that they had a sophisticated semi-automated application deployment process; further discussion revealed that deployment actually involved 72 manual configuration steps per production node. Such observations can be recorded in the Issue Brainstorming worksheet for further consideration later.

    It is also worth explaining the different levels of ALM maturity to the customer. By default the desired state of ALM maturity is set to Standard, but it is possible to set a desired state by area. You should strive for Advanced or Dynamic, and it always helps to explain the classification and its advantages.

    Image 3 – ALM levels by description

    The ALM assessment guide helps you arrive at a quantitative measure of the company's ALM maturity. The resulting graph, plotted on a spider web, shows the company's current state of ALM maturity against the desired state of ALM maturity. Further, since the results are classified by area, you can immediately spot the areas where the customer needs immediate help.

    Image 4 – The spider's web!

    The red cross icons are areas shouting out for immediate attention; the yellow exclamation icons are areas that need improvement. These icons are calculated from the difference between the current state of ALM maturity and the desired state of ALM maturity.

    Image 5 – Results by area

    Conclusion

    To conclude, the Rangers ALM assessment guide gives you the ability to: measure the customer's current ALM maturity level; understand the ALM maturity level the customer desires to achieve; and capture a healthy list of issues the customer wants to brainstorm further.

    Now what's next? Download and get started with the Rangers ALM Assessment Guide. If you have successfully captured the three pieces of information listed above, you are in a great position to make recommendations on the identified areas, highlighting the benefits that Visual Studio ALM tools would offer. In the next post I will cover how to take the ALM assessment results as the base for actually converting your recommendations into a sale. Remember to subscribe to http://feeds.feedburner.com/TarunArora. I would love to hear your feedback! If you have any recommendations on things that I should consider, or any questions or feedback, feel free to leave a comment.

    *** A special thanks goes out to fellow Rangers Willy, Ethem and Philip for reviewing the blog post and providing valuable feedback. ***

    Read the article

  • Exit link tracking with timestamped logs on 3rd party content

    - by dandv
    I want to track clicks on exit links placed in 3rd party content, for example on Twitter. I also need the timestamps of the clicks. Google Analytics can't be embedded in 3rd party content. Another option is a URL shortener like bit.ly. However, bit.ly and goo.gl don't log the time of the click at any better granularity than a full day. su.pr shows the time for the past day in its analytics graph, but the analytics download only includes the day, not the time. cli.gs was touted as having the most detailed analytics, yet it doesn't show the time either, and it forces the user through a preview page. Any ideas?
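
    One workaround, since you control where the shortened links point, is to run your own tiny redirector and log the clicks yourself; then the timestamp granularity is whatever you record. Below is a minimal, non-authoritative sketch using only the Python 2 standard library. The URL scheme, log file name and port are arbitrary choices for illustration, not an existing service.

    # Minimal self-hosted exit-link redirector that logs each click with a
    # full timestamp. Example link form (hypothetical):
    #   http://yourdomain.example:8080/?url=http%3A%2F%2Ftarget.example%2F
    import time
    import urlparse
    from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler

    class RedirectLogger(BaseHTTPRequestHandler):
        def do_GET(self):
            params = urlparse.parse_qs(urlparse.urlparse(self.path).query)
            target = params.get('url', [None])[0]
            if not target:
                self.send_error(400, 'missing url parameter')
                return
            # Log an ISO-8601 timestamp, client IP and target URL.
            with open('clicks.log', 'a') as log:
                log.write('%s\t%s\t%s\n' % (
                    time.strftime('%Y-%m-%dT%H:%M:%S'),
                    self.client_address[0], target))
            # In production you would restrict target domains here to avoid
            # becoming an open redirect.
            self.send_response(302)
            self.send_header('Location', target)
            self.end_headers()

    if __name__ == '__main__':
        HTTPServer(('', 8080), RedirectLogger).serve_forever()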

    Read the article

  • Retrieve Performance Data from SOA Infrastructure Database

    - by fip
    My earlier blog posting shows how to enable, retrieve and interpret BPEL engine performance statistics to aid performance troubleshooting. The strength of the BPEL engine statistics in EM is their breakdown per request. But there are some limitations with the BPEL performance statistics mentioned in that posting: the statistics are stored in memory instead of being persisted. To avoid memory overflow, the data is stored in a buffer of limited size; when the statistics entries exceed the limit, old data is flushed out to make way for new statistics. Therefore it can only keep the last X entries, and the statistics from 5 hours ago may not be there anymore. Also, the BPEL engine performance statistics only include latencies; they do not provide throughput.

    Fortunately, Oracle SOA Suite runs with the SOA infrastructure database, and a lot of performance data is naturally persisted there. It is at a coarser grain than the in-memory BPEL statistics, but it has its own strengths precisely because it is persisted. Here I would like to offer examples of some basic SQL queries you can run against the infrastructure database of Oracle SOA Suite 11g to acquire performance statistics for a given period of time. You can run them immediately after you modify the date range to match your actual system.

    1. Asynchronous/one-way message incoming rates

    The following query shows the number of messages sent to one-way/async BPEL processes during a given time period, organized by process name and state:

    select composite_name composite, state, count(*) Count
    from dlv_message
    where receive_date >= to_timestamp('2012-10-24 21:00:00','YYYY-MM-DD HH24:MI:SS')
      and receive_date <= to_timestamp('2012-10-24 21:59:59','YYYY-MM-DD HH24:MI:SS')
    group by composite_name, state
    order by Count;

    2. Throughput of BPEL process instances

    The following query shows the number of synchronous and asynchronous process instances created during a given time period. It lists instances in all states, including unfinished and faulted ones. The results include all composites across all SOA partitions:

    select state, count(*) Count, composite_name composite, component_name, componenttype
    from cube_instance
    where creation_date >= to_timestamp('2012-10-24 21:00:00','YYYY-MM-DD HH24:MI:SS')
      and creation_date <= to_timestamp('2012-10-24 21:59:59','YYYY-MM-DD HH24:MI:SS')
    group by composite_name, component_name, componenttype
    order by count(*) desc;

    3. Throughput and latencies of BPEL process instances

    This query builds on the previous one, providing more comprehensive information. It gives not only throughput but also the maximum, minimum and average elapsed time of BPEL process instances.
    select composite_name Composite, component_name Process, componenttype, state, count(*) Count,
      trunc(Max(extract(day from (modify_date-creation_date))*24*60*60
              + extract(hour from (modify_date-creation_date))*60*60
              + extract(minute from (modify_date-creation_date))*60
              + extract(second from (modify_date-creation_date))),4) MaxTime,
      trunc(Min(extract(day from (modify_date-creation_date))*24*60*60
              + extract(hour from (modify_date-creation_date))*60*60
              + extract(minute from (modify_date-creation_date))*60
              + extract(second from (modify_date-creation_date))),4) MinTime,
      trunc(AVG(extract(day from (modify_date-creation_date))*24*60*60
              + extract(hour from (modify_date-creation_date))*60*60
              + extract(minute from (modify_date-creation_date))*60
              + extract(second from (modify_date-creation_date))),4) AvgTime
    from cube_instance
    where creation_date >= to_timestamp('2012-10-24 21:00:00','YYYY-MM-DD HH24:MI:SS')
      and creation_date <= to_timestamp('2012-10-24 21:59:59','YYYY-MM-DD HH24:MI:SS')
    group by composite_name, component_name, componenttype, state
    order by count(*) desc;

    4. Combine all together

    Now let's combine all of these 3 queries together, and parameterize the start and end time stamps to make the script a bit more robust. The following script will prompt for the start and end time before querying against the database:

    accept startTime prompt 'Enter start time (YYYY-MM-DD HH24:MI:SS)'
    accept endTime prompt 'Enter end time (YYYY-MM-DD HH24:MI:SS)'

    Prompt "==== Rejected Messages ====";
    REM 2012-10-24 21:00:00
    REM 2012-10-24 21:59:59
    select count(*), composite_dn
    from rejected_message
    where created_time >= to_timestamp('&&StartTime','YYYY-MM-DD HH24:MI:SS')
      and created_time <= to_timestamp('&&EndTime','YYYY-MM-DD HH24:MI:SS')
    group by composite_dn;

    Prompt " ";
    Prompt "==== Throughput of one-way/asynchronous messages ====";
    select state, count(*) Count, composite_name composite
    from dlv_message
    where receive_date >= to_timestamp('&StartTime','YYYY-MM-DD HH24:MI:SS')
      and receive_date <= to_timestamp('&EndTime','YYYY-MM-DD HH24:MI:SS')
    group by composite_name, state
    order by Count;

    Prompt " ";
    Prompt "==== Throughput and latency of BPEL process instances ===="
    select state, count(*) Count,
      trunc(Max(extract(day from (modify_date-creation_date))*24*60*60
              + extract(hour from (modify_date-creation_date))*60*60
              + extract(minute from (modify_date-creation_date))*60
              + extract(second from (modify_date-creation_date))),4) MaxTime,
      trunc(Min(extract(day from (modify_date-creation_date))*24*60*60
              + extract(hour from (modify_date-creation_date))*60*60
              + extract(minute from (modify_date-creation_date))*60
              + extract(second from (modify_date-creation_date))),4) MinTime,
      trunc(AVG(extract(day from (modify_date-creation_date))*24*60*60
              + extract(hour from (modify_date-creation_date))*60*60
              + extract(minute from (modify_date-creation_date))*60
              + extract(second from (modify_date-creation_date))),4) AvgTime,
      composite_name Composite, component_name Process, componenttype
    from cube_instance
    where creation_date >= to_timestamp('&StartTime','YYYY-MM-DD HH24:MI:SS')
      and creation_date <= to_timestamp('&EndTime','YYYY-MM-DD HH24:MI:SS')
    group by composite_name, component_name, componenttype, state
    order by count(*) desc;
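
    If you prefer to collect these numbers from a script rather than from SQL*Plus, a thin wrapper over the same queries works too. The sketch below is a non-authoritative example assuming the cx_Oracle driver and access to the SOA infrastructure schema; the user name, password and connect string are placeholders, not real values.

    # Sketch: run the one-way/async throughput query (query 1 above) with
    # bind variables instead of SQL*Plus substitution variables.
    import cx_Oracle

    SQL = """
    select composite_name, state, count(*)
      from dlv_message
     where receive_date >= to_timestamp(:start_ts, 'YYYY-MM-DD HH24:MI:SS')
       and receive_date <= to_timestamp(:end_ts, 'YYYY-MM-DD HH24:MI:SS')
     group by composite_name, state
     order by count(*)"""

    # Placeholder credentials and DSN; point these at your SOA infra schema.
    conn = cx_Oracle.connect('dev_soainfra', 'password', 'dbhost:1521/orcl')
    cursor = conn.cursor()
    cursor.execute(SQL, start_ts='2012-10-24 21:00:00',
                        end_ts='2012-10-24 21:59:59')
    for composite, state, cnt in cursor:
        print composite, state, cnt
    cursor.close()
    conn.close()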

    Read the article

  • Smart View and MS Office versions

    - by Fekete Zoltán
    Smart View is, among other things, the query, analysis, controlling and data-entry front end for Oracle Essbase (Hyperion). Smart View is available as an MS Excel add-in. It fully supports planning, budgeting, controlling and analysis work. Essbase is an OLAP server close to controllers' hearts and hands, and it is also the foundation of Hyperion Planning. Which MS Office versions does Smart View support? MS Office 2000 (XP), 2003 and 2007. This information is described in the documents listed under Oracle Enterprise Performance Management Products - Supported Platforms Matrices. The complete documentation for the current 11.1.1.3 release of Oracle Enterprise Performance Management can be found here.

    Read the article

  • NuGet – My favorite .Net OSS project of the year 2010

    - by shiju
    NuGet is a free, open source package management system for the .NET platform. NuGet is a member of the Outercurve Foundation. NuGet is a very useful tool for .NET developers who use open source libraries in their applications. NuGet enables .NET developers to easily discover, download, install and update packages in .NET projects, and it also handles dependency management between libraries. Today, the .NET open source community is growing quickly and provides a huge set of useful libraries. Using NuGet, .NET developers can easily find these libraries and keep them updated in their .NET projects. The client-side NuPack tools provide full integration with Visual Studio 2010. You can get NuGet from its project site, http://nuget.codeplex.com. Read the Getting Started page at CodePlex to learn how to use NuGet. The screenshot below shows the NuGet package window for adding a library package reference within Visual Studio 2010.

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #003

    - by pinaldave
    Here is the list of curated articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my most favorite articles and have listed them here with additional notes below each. Let me know which one of the following is your favorite article from memory lane.

    2006

    This was the first year of my blogging, and I was learning lots of new things as I went. I was indeed an infant in blogging a few years ago. However, as time passed I learned a lot. This was a year of experiments and new learning.

    2007

    Working as a full-time DBA I often encountered various errors, and I started to learn how to avoid those errors and document the same.

    ERROR Msg 5174 Each file size must be greater than or equal to 512 KB. Whenever I see this error I wonder why someone is trying to create such an extremely small database. Anyway, it does not matter what I think; I keep on seeing this error often in industry. The solution to the error is equally interesting: just create a larger database.

    Dilbert Humor. This was my very first encounter with database humor, and I started to love it. It does not matter how many times we read this cartoon, it does not get old.

    Generate Script with Data from Database – Database Publishing Wizard. Generating a schema script with data is one of the most frequently performed tasks among SQL Server data professionals. There are many ways to do the same. In the above article I demonstrated how we can use the Database Publishing Wizard to accomplish this. It was new to me at that time, but I still have not seen much adoption of it in the industry. Here is one of my videos where I demonstrate how we can generate data with schema.

    2008

    Delete Backup History – Cleanup Backup History. Deleting backup history is important too, but it should be done carefully. If this is not carried out at regular intervals, there is a good chance that MSDB will fill up with all the old history. Every organization is different; some would like to keep the history for 30 days and some for a year, but there should be some limit. One should regularly archive the database backup history.

    South Asia MVP Open Days 2008. This was my very first year as a Microsoft MVP. I indeed had a big blast at the event, and the fun was incredible. After this event I have attended many different MVP events, but the fun and learning this particular event presented were amazing, and just like me many others are not able to forget it. Here are other links related to the event: South Asia MVP Open Day 2008 – Goa | South Asia MVP Open Day 2008 – Goa – Day 1 | South Asia MVP Open Day 2008 – Goa – Day 2 | South Asia MVP Open Day 2008 – Goa – Day 3.

    2009

    Enable or Disable Constraint. This is a very simple script, but I personally keep forgetting it, so I blogged it. To this day I keep referencing it again and again, as sometimes a very little thing is hard to remember.

    Policy Based Management – Create, Evaluate and Fix Policies. This article covers the most spectacular feature of SQL Server 2008 – policy-based management – and how configuring SQL Server with the policy-based management architecture can make a powerful difference. Policy-based management is loaded with several advantages. It can help you implement various policies for reliable configuration of the system. It also provides additional administrative assistance to DBAs and helps them effortlessly manage various tasks of SQL Server across the enterprise.
    SQLPASS 2009 – My Very First SQLPASS Experience. Just brilliant! I had never experienced such a thing in my life. SQL, SQL and SQL – SQL all around! I am listing my own reasons here in order of importance to me: networking with SQL fellows and experts; putting faces to names and avatars; learning and improving my SQL skills; understanding the structure of the largest SQL Server professional association; attending my favorite training sessions. Since then I have never missed this event even once. This event is my favorite, and something about it keeps me going. Here are additional posts related to SQLPASS 2009: SQL PASS Summit, Seattle 2009 – Day 1 | SQL PASS Summit, Seattle 2009 – Day 2 | SQL PASS Summit, Seattle 2009 – Day 3 | SQL PASS Summit, Seattle 2009 – Day 4.

    2010

    Get All the Information of Database using sys.databases. Even though we believe that we know everything about our databases, we do not know a lot of things about them. This little script enables us to learn many details about our databases that we may not be familiar with. Run it on your server today and see how well you know your database.

    Reducing CXPACKET Wait Stats for High Transactional Database. While engaged in a performance tuning consultation for a client, a situation occurred where they were facing a lot of CXPACKET wait stats. The client asked me if I could help them reduce this huge number of wait stats. I usually receive this kind of request from other clients as well, but the important thing to understand is whether this question has any merit or benefit, or not. I discuss the same in this article – a bit long, but insightful for sure.

    Error related to Database in Use. There are many database management operations in SQL Server which require exclusive access to the database, and it is not always possible to get it. When any database is online in SQL Server, applications or system threads are often accessing it. This means the database cannot grant exclusive access, and the operations which require this access throw an error. There is a very easy method to overcome this minor issue – a single-line script can give you exclusive access to the database.

    Difference between DATETIME and DATETIME2. Developers have found the root cause of the problem when dealing with date functions: when datetime values are converted (implicitly or explicitly) between different data types, some precision is lost, so the results cannot match each other as expected. In this blog post I go over very interesting details and the differences between DATETIME and DATETIME2.

    History of SQL Server Database Encryption. I recently met Michael Coles and Rodney Landrum, the authors of the one-of-a-kind book Expert SQL Server 2008 Encryption, at SQLPASS in Seattle. During the conversation we ended up discussing how Microsoft is evolving encryption technology. The same discussion led to talking about the history of encryption tools in SQL Server. Michael pointed me to page 18 of his encryption book. He explicitly gave me permission to reproduce the relevant part of the history from his book.

    2011

    Functions FIRST_VALUE and LAST_VALUE with OVER clause and ORDER BY. Sometimes an interesting feature and a smart audience make a total difference. Over the last two days, I had been writing about the SQL Server 2012 features FIRST_VALUE and LAST_VALUE. I created a puzzle which was very interesting and got many people attempting to resolve it.
    It was based on the following two articles: Introduction to FIRST_VALUE and LAST_VALUE | Introduction to FIRST_VALUE and LAST_VALUE with OVER clause. I even provided a hint about how one can solve this problem. The best part was that many people solved the problem without using the hint! Try your luck!

    A Real Story of Book Getting 'Out of Stock' to A 25% Discount Story Available. This is a great problem, and everybody would love to have it. We had it and we loved it. Our book went out of stock within 48 hours of release and the stocks were empty. We faced many issues and learned many valuable lessons. Some we were able to avoid in the future, and some we are still facing, as those problems have no solutions. However, since that day our books have never gone out of stock. This is an inspiring learning story for us, and I am confident that you will love reading it as well.

    Introduction to LEAD and LAG – Analytic Functions Introduced in SQL Server 2012. SQL Server 2012 introduces the new analytic functions LEAD() and LAG(). These functions access data from a subsequent row (for LEAD) and a previous row (for LAG) in the same result set without the use of a self-join. It would be very difficult to explain this in words, so I attempt a small example to explain the functions. I had a fantastic time writing this blog post, and I am very confident that when you read it, you will like it.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #038

    - by Pinal Dave
    Here is the list of selected articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my most favorite articles and have listed them here with additional notes below each. Let me know which one of the following is your favorite article from memory lane.

    2007

    CASE Statement in ORDER BY Clause – ORDER BY using Variable. This article was written per a request from the application development team leader of my company. His team encountered code where the application was preparing a string for the ORDER BY clause of the SELECT statement. The application was passing this string as a variable to a Stored Procedure (SP), and the SP was using EXEC to execute the SQL string. This is not good for performance, as the stored procedure has to recompile every time due to EXEC. sp_executesql can do the same task, but it is still not the best for performance.

    SSMS – View/Send Query Results to Text/Grid/Files. Results to Text – CTRL + T; Results to Grid – CTRL + D; Results to File – CTRL + SHIFT + F.

    2008

    Introduction to SPARSE Columns Part 2. I wrote about Introduction to SPARSE Columns Part 1. Let us understand the concept of the SPARSE column in more detail. I suggest you read the first part before continuing with this article. All SPARSE columns are stored as one XML column in the database. Let us see some of the advantages and disadvantages of SPARSE columns.

    Deferred Name Resolution. How come an SP can be created successfully when the table name is incorrect, but cannot be created when an incorrect column is used?

    2009

    Backup Timeline and Understanding of Database Restore Process in Full Recovery Model. In general, database backups in the full recovery model are taken as three different kinds of backup files: Full Database Backup, Differential Database Backup, and Log Backup.

    Restore Sequence and Understanding NORECOVERY and RECOVERY. While performing a RESTORE operation, if you are restoring database files, always use the NORECOVERY option, as it keeps the database in a state where more backup files can be restored. It also keeps the database offline to prevent any changes, which could create integrity issues. Once all backup files are restored, run the RESTORE command with the RECOVERY option to bring the database online and operational.

    Four Different Ways to Find Recovery Model for Database. Perhaps the best thing about the technical domain is that most things can be done in more than one way. It is always useful to know the various methods of performing a single task.

    Two Methods to Retrieve List of Primary Keys and Foreign Keys of Database. When the Information Schema is used, we are not able to discern between primary keys and foreign keys; we get both kinds of keys together. In the case of the sys schema, we can query the data our preferred way and can join these tables to other tables to retrieve additional data.

    Get Last Running Query Based on SPID. SPID returns the session ID of the current user process. The acronym SPID comes from the name of its earlier version, Server Process ID.

    2010

    SELECT * FROM dual – Dual Equivalent. Dual is a table that is created by Oracle together with the data dictionary. It consists of exactly one column named "dummy" and one record. The value of that record is X. You can check the content of the DUAL table using the following syntax: SELECT * FROM dual.

    Identifying Statistics Used by Query. Someone asked this question in my training class on query optimization and performance tuning:
    “Can I know which statistics were used by my query?”

    2011

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 14 of 31. What are the basic functions of the master, msdb, model, tempdb and resource databases? What is the Maximum Number of Indexes per Table? Explain a Few of the New Features of SQL Server 2008 Management Studio. Explain IntelliSense for Query Editing. Explain MultiServer Query. Explain Query Editor Regions. Explain Object Explorer Enhancements. Explain Activity Monitor.

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 15 of 31. What is Service Broker? Where are SQL Server Usernames and Passwords Stored in SQL Server? What is Policy Management? What is Database Mirroring? What are Sparse Columns? What does the TOP Operator Do? What is a CTE? What is the MERGE Statement? What is a Filtered Index? Which New Data Types were Introduced in SQL SERVER 2008?

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 16 of 31. What are the Advantages of Using CTEs? How can we Rewrite Sub-Queries into Simple Select Statements or with Joins? What is CLR? What are Synonyms? What is LINQ? What are Isolation Levels? What is the Use of the EXCEPT Clause? What is XPath? What is NOLOCK? What is the Difference between an Update Lock and an Exclusive Lock?

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 17 of 31. How will you Handle Errors in SQL SERVER 2008? What is RAISERROR? How to Rebuild the Master Database? What is the XML Datatype? What is Data Compression? What is the Use of DBCC Commands? How to Copy Tables, Schemas and Views from one SQL Server to Another? How to Find Tables without Indexes?

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 18 of 31. How to Copy Data from One Table to Another Table? What are Catalog Views? What are PIVOT and UNPIVOT? What is a Filestream? What is SQLCMD? What do you mean by TABLESAMPLE? What is ROW_NUMBER()? What are Ranking Functions? What is Change Data Capture (CDC) in SQL Server 2008?

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 19 of 31. How can I Track Changes or Identify the Latest Insert-Update-Delete on a Table? What is CPU Pressure? How can I Get Data from a Database on Another Server? What are the Bookmark Lookup and RID Lookup? What is the Difference between ROLLBACK IMMEDIATE and WITH NO_WAIT during ALTER DATABASE? What is the Difference between GETDATE and SYSDATETIME in SQL Server 2008? How can I Check whether Automatic Statistics Update is Enabled or not? How to Find the Index Size for Each Index on a Table? What is the Difference between a Seek Predicate and a Predicate? What are the Basics of Policy Management? What are the Advantages of Policy Management?

    SQL SERVER – Interview Questions and Answers – Frequently Asked Questions – Day 20 of 31. What are Policy Management Terms? What is FILLFACTOR? Where in MS SQL Server is '100' equal to '0'? What are Points to Remember while Using the FILLFACTOR Argument? What is a ROLLUP Clause? What are the Various Limitations of Views? What is a Covered Index? When I Delete any Data from a Table, does SQL Server Reduce the Size of that Table? What are Wait Types? How to Stop a Log File from Growing too Big? If a Stored Procedure is Encrypted, can we See its Definition in Activity Monitor?
    2012

    Example of Width Sensitive and Width Insensitive Collation. Width-sensitive collation: when a single-byte character (half-width) and the same character represented as a double-byte character (full-width) compare as not equal, the collation is width sensitive. In this example we have one table with two columns; one column has a width-sensitive collation and the second column has a width-insensitive collation.

    Find Column Used in Stored Procedure – Search Stored Procedure for Column Name. A very interesting conversation about how to find a column used in a stored procedure. There are two different characters in the story, and both are having a conversation about how to find the column in the stored procedure. Here is the two-part story: Part 1 | Part 2.

    SQL SERVER – 2012 Functions – FORMAT() and CONCAT() – An Interesting Usage.

    Generate Script for Schema and Data – SQL in Sixty Seconds #021 – Video. In simple words, in many cases a database moves from one place to another place. It is not always possible to back up and restore databases. There are cases when only part of the database (with schema and data) has to be moved. In this video we learn that we can easily generate a script for the schema and the data and move it from one server to another.

    INFORMATION_SCHEMA.COLUMNS and Value Character Maximum Length -1. I often see the value -1 in the CHARACTER_MAXIMUM_LENGTH column of the INFORMATION_SCHEMA.COLUMNS table. I understand that the length of a column can be between 0 and a large number, but I do not get it when I see a negative value (i.e. -1). Any insight on this subject?

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • What is a “pretty and proper OO” way for handling sessions and authentication?

    - by asdfqwer
    Is coupling these two concepts a bad approach? As of right now I'm delegating all session handling, and whether or not a user desires to log out, in my config.inc file. As I was writing my Auth class I started wondering whether or not my Auth class should be taking care of most of the logic in my config.inc. Regardless, I'm sure there's a more elegant way of handling this... Here is what I have in my config.inc (also, a large chunk of this code is based on a reply I found on SO, except I can't find the source ._.):

    ini_set('session.name', 'SID');

    # session management
    session_set_cookie_params(24*60*60); // set SID cookie lifetime
    session_start();

    if (isset($_SESSION['LOGOUT'])) {
        session_destroy();                           // destroy session data
        $_SESSION = array();                         // destroy session data sanity check
        setcookie('SID', '', time() - 24*60*60);     // destroy session cookie data
        #header('Location: '.DOCROOT);
    } elseif (isset($_SESSION['SID_AUTH'])) {        // verify user has authenticated
        if (!isset($_SESSION['SID_CREATED'])) {
            $_SESSION['SID_CREATED'] = time();
        } elseif (time() - $_SESSION['SID_CREATED'] > 6*60*60) {
            // session started more than 6 hours ago
            session_regenerate_id();                 // reset SID value
            $_SESSION['SID_CREATED'] = time();       // update creation time
        }
        if (isset($_SESSION['SID_MODIFIED']) &&
            (time() - $_SESSION['SID_MODIFIED'] > 12*60*60)) {
            // last request was more than 12 hours ago
            session_destroy();                       // destroy session data
            $_SESSION = array();                     // destroy session data sanity check
            setcookie('SID', '', time() - 24*60*60); // destroy session cookie data
        }
        $_SESSION['SID_MODIFIED'] = time();          // update last activity time stamp
    }

    Read the article

  • Big Data – Evolution of Big Data – Day 3 of 21

    - by Pinal Dave
    In yesterday’s blog post we answered what Big Data is. Today we will understand why and how the evolution of Big Data happened. Though the answer is very simple, I would like to tell it in the form of a history lesson.

    Data in Flat Files

    In earlier days data was stored in flat files, and there was no structure in a flat file. If any data had to be retrieved from the flat file, it was a project by itself. There was no possibility of retrieving the data efficiently, and data integrity was just a term discussed without any modeling or structure around it. A database residing in flat files had more issues than we would like to discuss in today's world. It was more like a nightmare whenever there was any data processing involved in an application. Though applications developed at that time were also not that advanced, the need for data was always there, and there was always a need for proper data management.

    Edgar F. Codd and 12 Rules

    Edgar Frank Codd was a British computer scientist who, while working for IBM, invented the relational model for database management, the theoretical basis for relational databases. He presented 12 rules for the relational database, and suddenly the chaotic world of the database seemed to find discipline in the rules. The relational database was a promised land for all the unstructured database users. Relational databases introduced relationships between data and also improved the performance of data retrieval. The database world immediately saw a major transformation, and pretty much every vendor and database user suddenly started to adopt relational database models.

    Relational Database Management Systems

    Once Edgar F. Codd proposed his 12 rules for the RDBMS, many different vendors started to build applications and tools to support relationships between data. This was indeed a learning curve for many developers who had never before worked with modeling a database. However, as time passed, pretty much everybody accepted the relational model, and products evolved to perform their best within the boundaries of RDBMS concepts. This was the best era for databases, and it gave the world extreme experts as well as some of the best products. The Entity Relationship model also evolved at the same time. In software engineering, an Entity-Relationship model (ER model) is a data model for describing a database in an abstract way.

    Enormous Data Growth

    Well, everything was going fine with the RDBMS in the database world. As there were no major challenges, the adoption of RDBMS applications and tools was pretty much universal. There was at times a race to make the developer's life easier with RDBMS management tools. Due to their extreme popularity and ease of use, pretty much all data was stored in RDBMS systems. New-age applications were built, and social media took the world by storm. Every organization was feeling pressure to provide the best experience for its users based on the data it had. While this was all going on, at the same time data was growing in pretty much every organization and application.

    Data Warehousing

    The enormous data growth now presented a big challenge for organizations who wanted to build intelligent systems based on the data and provide a near-real-time, superior user experience to their customers. Various organizations immediately started building data warehousing solutions where the data was stored and processed.
    The trend of business intelligence became an everyday need. Data was received from transaction systems and processed overnight to build intelligent reports from it. Though this was a great solution, it had its own set of challenges. The relational database model and data warehousing concepts were all built with traditional relational database modeling in mind, and they still faced many challenges when unstructured data was present.

    Interesting Challenge

    Every organization had the expertise to manage structured data, but the world had already changed to unstructured data. There was intelligence in videos, photos, SMS, text, social media messages and various other data sources. All of these now needed to be brought to a single platform to build a uniform system which does what businesses need. The way we do business has also changed. There was a time when users only got the features technology supported; now users ask for a feature, and technology is built to support it. The need for real-time intelligence from fast-paced data flow is now becoming a necessity. A large amount (Volume) of diverse (Variety), high-speed (Velocity) data: these are the properties of the data. Traditional database systems have limits in resolving the challenges this new kind of data presents. Hence the need for Big Data science. We need innovation in how we handle and manage data. We need creative ways to capture data and present it to users. Big Data is reality!

    Tomorrow

    In tomorrow's blog post we will discuss the basics of Big Data architecture.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Project Euler 18: (Iron)Python

    - by Ben Griswold
    In my attempt to learn (Iron)Python out in the open, here's my solution for Project Euler Problem 18.  As always, any feedback is welcome.

    # Euler 18
    # http://projecteuler.net/index.php?section=problems&id=18
    # By starting at the top of the triangle below and moving
    # to adjacent numbers on the row below, the maximum total
    # from top to bottom is 23.
    #
    #    3
    #   7 4
    #  2 4 6
    # 8 5 9 3
    #
    # That is, 3 + 7 + 4 + 9 = 23.
    # Find the maximum total from top to bottom of the triangle below:
    #
    # 75
    # 95 64
    # 17 47 82
    # 18 35 87 10
    # 20 04 82 47 65
    # 19 01 23 75 03 34
    # 88 02 77 73 07 63 67
    # 99 65 04 28 06 16 70 92
    # 41 41 26 56 83 40 80 70 33
    # 41 48 72 33 47 32 37 16 94 29
    # 53 71 44 65 25 43 91 52 97 51 14
    # 70 11 33 28 77 73 17 78 39 68 17 57
    # 91 71 52 38 17 14 91 43 58 50 27 29 48
    # 63 66 04 68 89 53 67 30 73 16 69 87 40 31
    # 04 62 98 27 23 09 70 98 73 93 38 53 60 04 23
    #
    # NOTE: As there are only 16384 routes, it is possible to solve
    # this problem by trying every route. However, Problem 67, is the
    # same challenge with a triangle containing one-hundred rows; it
    # cannot be solved by brute force, and requires a clever method! ;o)

    import time
    start = time.time()

    triangle = [
        [75],
        [95, 64],
        [17, 47, 82],
        [18, 35, 87, 10],
        [20, 04, 82, 47, 65],
        [19, 01, 23, 75, 03, 34],
        [88, 02, 77, 73, 07, 63, 67],
        [99, 65, 04, 28, 06, 16, 70, 92],
        [41, 41, 26, 56, 83, 40, 80, 70, 33],
        [41, 48, 72, 33, 47, 32, 37, 16, 94, 29],
        [53, 71, 44, 65, 25, 43, 91, 52, 97, 51, 14],
        [70, 11, 33, 28, 77, 73, 17, 78, 39, 68, 17, 57],
        [91, 71, 52, 38, 17, 14, 91, 43, 58, 50, 27, 29, 48],
        [63, 66, 04, 68, 89, 53, 67, 30, 73, 16, 69, 87, 40, 31],
        [04, 62, 98, 27, 23, 9, 70, 98, 73, 93, 38, 53, 60, 04, 23]]

    # Loop through each row of the triangle starting at the base.
    for a in range(len(triangle) - 1, -1, -1):
        for b in range(0, a):
            # Get the maximum value for adjacent cells in current row.
            # Update the cell which would be one step prior in the path
            # with the new total. For example, compare the first two
            # elements in row 15. Add the max of 04 and 62 to the first
            # position of row 14. This provides the max total from row 14
            # to 15 starting at the first position. Continue to work up
            # the triangle until the maximum total emerges at the
            # triangle's apex.
            triangle[a-1][b] += max(triangle[a][b], triangle[a][b+1])

    print triangle[0][0]
    print "Elapsed Time:", (time.time() - start) * 1000, "millisecs"
    a = raw_input('Press return to continue')

    Read the article

  • Updated Technical Best Practices whitepaper

    - by ACShorten
    The Technical Best Practices whitepaper has been updated with the latest advice. This edition of the whitepaper covers advice from the internal management team in the product group that manages our environments. Our product teams manage over 1,500 copies of the product, covering every version, every platform, and every phase of our development, testing and production product development cycle. The technical team managing that group of environments has compiled additional advice that has been incorporated into the Technical Best Practices and other whitepapers (including Performance Troubleshooting and the Software Configuration Management series). New advice includes new installation advice, advanced settings, new security settings, and advice for both Oracle WebLogic and IBM WebSphere installations. The Technical Best Practices whitepaper is available from My Oracle Support as Doc Id: 560367.1. To assist readers of past editions of the whitepaper, new or updated advice is marked with an appropriate graphic.

    Read the article

  • Why do we keep using CSV?

    - by Stephen
    Why do we keep using CSV? I recently made a shift to working in the health domain, and despite the wonderful work on data transfer standards, all data transfer is in CSV, both for reporting to external organisations and for data migrations when implementing new systems. Unfortunately the use of CSV is the cause of endless repetition of the same stupid errors, with the same waste of developer time (bad escaping, failing to handle null fields, etc.). I know we can do better, and anything between JSON and XML (depending on the instance) would be fine. (Most of the time this is data going from one MS SQL Server 2005 to another!) I feel as if each time I see this happening I am literally watching one developer waste another's time. So why do we keep shafting each other? When will we stop?
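
    If CSV really is unavoidable, most of those recurring errors disappear when the format is never hand-rolled. Here is a small illustrative sketch in Python: the csv module's quoting handles the escaping, while the empty-string-for-null convention shown is just an assumed agreement between sender and receiver, not a standard.

    # Let the csv module do the escaping instead of joining strings by hand.
    import csv

    rows = [
        (1, 'Smith, John', None),        # embedded comma and a null field
        (2, 'O"Malley, Pat', 'active'),  # embedded quote
    ]

    with open('export.csv', 'wb') as f:  # 'wb' on Python 2
        writer = csv.writer(f, quoting=csv.QUOTE_MINIMAL)
        for row in rows:
            # Write nulls as empty strings; the receiver must agree on this.
            writer.writerow(['' if v is None else v for v in row])

    with open('export.csv', 'rb') as f:
        for record in csv.reader(f):
            print record                 # fields come back correctly unescaped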

    Read the article

  • Eight New Oracle Database Assemblies Ready to Run In Your Oracle VM Cloud with Oracle Enterprise Manager 12c

    - by Adam Hawley
    By Sudip Datta, Senior Director, Oracle Enterprise Manager Product Management. This week, eight database virtual assemblies were released via EM 12c Self-Update. The database assemblies are already patched to Oracle-recommended levels. Customers running EM 12c in online mode (i.e. connected to My Oracle Support) will see the assemblies in their EM console (screenshot below). They can then deploy the assemblies using the Self-Service Provisioning outlined in the "Cloud Administration Guide". The EM 12c agent will be deployed along with the assemblies, so the databases will be managed automatically from the outset. You can also get a general demo of the cloud management features (including assembly deployment) at http://www.oracle.com/technetwork/oem/cloud-mgmt/index.html. More database and middleware assemblies will follow soon.

    Read the article

  • WIF

    - by kaleidoscope
    Windows Identity Foundation (WIF) enables .NET developers to externalize identity logic from their applications, improving developer productivity, enhancing application security, and enabling interoperability. It is a framework for implementing claims-based identity in your applications. With WIF you can create more secure applications by reducing custom implementations and using a single, simplified identity model based on claims. Windows Identity Foundation is part of Microsoft's identity and access management solution built on Active Directory, which also includes: Active Directory Federation Services 2.0 (formerly known as "Geneva" Server), a security token service for IT that issues and transforms claims and other tokens, manages user access, and enables federation and access management for simplified single sign-on; and Windows CardSpace 2.0 (formerly known as Windows CardSpace "Geneva"), which helps users navigate access decisions and helps developers build customer authentication experiences. Reference: http://msdn.microsoft.com/en-us/security/aa570351.aspx Geeta

    Read the article

  • Interview

    Duncan Mills, senior director of product management at Oracle, talks about the recent release of Oracle Enterprise Pack for Eclipse 11g.

    Read the article

  • Virtualization at Oracle - Six Part Series

    - by Monica Kumar
    Oracle's Matthias Pfuetzner and Detlef Drewanz have written a series of blog articles that go through virtualization technologies that can be used with the Oracle stack. I highly recommend them to anyone interested in learning about what Oracle has to offer in server virtualization. The series includes: Part 1: Overview; Part 2: Oracle VM Server for SPARC; Part 3: Oracle VM Server for x86; Part 4: Oracle Solaris Zones and Linux Containers; Part 5: Resource Management as Enabling Technology for Virtualization; Part 6: Network Virtualization and Network Resource Management. These articles give a good technical overview of the concepts of virtualization as well as Oracle's server virtualization solutions spanning both SPARC and x86 architectures. Happy reading!

    Read the article

  • New AutoVue Movies Available at the Oracle AutoVue Channel!

    - by Gerald Fauteux
    There are 4 new movies available at the Oracle AutoVue Channel. Three of these latest AutoVue movies demonstrate how AutoVue can be used in various processes in the Electronics & High Tech sector. The fourth shows how AutoVue can be used on an iPad using Oracle Virtual Desktop Infrastructure (OVDI). They are: Improving the Design Process with AutoVue in the Electronics & High Tech Industry – Watch it now (7:17); Improving Manufacturing and Assembly with AutoVue in the Electronics & High Tech Industry – Watch it now (7:55); Improving Supply Chain Management with AutoVue in the Electronics & High Tech Industry – Watch it now (4:42); Mobile Asset Management on the iPad with AutoVue and Oracle Virtual Desktop Infrastructure (OVDI) – Watch it now (3:52). See all the movies available at the Oracle AutoVue Channel!

    Read the article

  • Oracle Linux 6 Implementation Essentials Certification Exam Now Available

    - by Antoinette O'Sullivan
    Get proof of your Linux system administration skills by taking the Oracle Linux 6 Implementation Essentials Certification exam. This certification is available to all candidates. Oracle Partner members earning this certification will be recognized as OPN Certified Specialists. The exam takes under 3 hours and asks between 120 and 150 questions on areas including: Introduction to Oracle Linux; Installing Oracle Linux 6; Linux Boot Process; Oracle Linux System Configuration and Process Management; Oracle Linux Package Management; Ksplice Zero Downtime Updates; Automate Tasks and System Logging; User and Group Administration; Oracle Linux File Systems and Storage Administration; Network Administration; Oracle Linux System Monitoring and Troubleshooting. Oracle certifications are among the most sought-after badges of credibility for expertise in the information technology marketplace. See Benefits of Oracle Certification for more information. To prepare for this exam, you can take the Oracle Linux System Administration training.

    Read the article

  • How do I install an older 2.6.37 Kernel Version?

    - by Seyed Mohammad
    I have a Sony VAIO P netbook, and for several issues (graphics driver, audio driver and power management) I want to install an older version of the Linux kernel on Ubuntu 11.10 (actually it's Xubuntu), which seems to be much more suitable. So I searched for Ubuntu kernels and found this link, which includes all versions of the Linux kernel distributed by Ubuntu. I am looking for a version before 2.6.38 (to escape the known power management issue) and of course to solve my many driver problems! I guess my best bet is 2.6.37, but there are several 2.6.37.x-x kernels! Can someone point me to the right choice? In each folder (for example: this one) there are several DEB packages. Which packages should I install? (Note: I have a 32-bit system.) What is the installation process? sudo dpkg -i *.deb? Is this fine, or are additional steps required? Thanks.

    Read the article

  • The winning combination: Oracle VM Server for x86 + Oracle Sun Fire HW

    - by Karim Berrah
    You might be wondering why OVM Server for x86 (OVM/x86 here and below) should be seriously considered as a nice (from a business point of view) alternative to standard hypervisors if you are virtualizing Oracle software, especially if you are planning to move to Oracle x86 hardware (rackmount or blades). Well, let's see some not-so-well-known facts that might interest you and help you save more money for your entire company (and not only the virtualization team).

    Fact 1: OVM/x86 is considered a hard partitioning technology (check page 2 of Oracle Server Partitioning Licensing Policies). So if you are buying new servers based on the latest Intel Xeon E7 CPUs (10 cores per socket) and have licensing issues in deploying further Oracle software because you are using a hypervisor not recognized as a hard partitioning technology (like VMware), then check here how to do it with OVM. This might help you continue to deploy your Oracle DB instances on new x86 hardware (12-core, 40-core, 64-core servers) in a reasonable way, without having to pay licenses for 12, 40 or 64 CPUs. You might also consider migrating your legacy Oracle databases to a virtualized environment like OVM/x86 and recover some CPU licenses that can be reused somewhere else in production.

    Fact 2: OVM/x86 is free to use, without any extra license for any specific feature (Live Migration, High Availability, embedded management console). If you want to use it on non-Oracle hardware, there is a support fee per system and per year that is well below VMware support (Oracle VM Premier Limited Support for systems up to 2 CPUs, and Oracle VM Premier Support for any bigger system, independent of the number of populated sockets).

    Fact 3: support is included with your Oracle x86 hardware support (Oracle Premier Support for Systems), and you can re-install Oracle Linux, Oracle Solaris or Oracle VM Server for x86 on your system without being charged, while keeping the same support level.

    Fact 4: it is less expensive to virtualize Oracle Linux or Oracle Solaris on OVM/x86 with Oracle hardware than with any similar solution based on VMware, because all the VMs are then supported and licensed when you buy Oracle hardware with Oracle Premier Support for Systems.

    Fact 5: Oracle VM Templates bring you many virtual machines already installed, patched and optimized for various Oracle applications. And to be more specific, those templates are fully supported by Oracle, which is not really true when it comes to other hypervisors. By optimized VM kernel, I mean PV drivers, OVM-ready kernels in the VM, a single source clock for all the VMs, better memory management of the VM, and so on.

    Fact 6: there are no extra costs for a management console. OVM comes with a free OVM Manager package for Linux.

    More info:
    Latest announcement of OVM/x86 update 2.2.2
    A short flash demo of OVM Server for x86
    A short flash demo on OVM Templates and Virtual Assembly Builder
    Oracle Linux Support and Oracle VM Support Global Price List
    ISVs: Benefits for Independent Software Vendors (ISVs) in using OVM/x86
    Consultant Services: Advanced Customer Services for OVM/x86
    Technical Features
    Best Practices and Guidelines for OVM with Oracle Blades
    Reduce TCO and Get More Value from your x86 Infrastructure

    Read the article

  • Latest DSTv15 Timezone Patches Available for E-Business Suite

    - by Steven Chan
    If your E-Business Suite Release 11i or 12 environment is configured to support Daylight Saving Time (DST) or international time zones, it's important to keep your timezone definition files up-to-date. They were last changed in July 2010 and released as DSTv14. DSTv15 is now available and certified with Oracle E-Business Suite Releases 11i and 12.

    Is Your Apps Environment Affected?

    When a country or region changes DST rules or their time zone definitions, your Oracle E-Business Suite environment will require patching if:
    Your Oracle E-Business Suite environment is located in the affected country or region, OR
    Your Oracle E-Business Suite environment is located outside the affected country or region but you conduct business or have customers or suppliers in the affected country or region.

    We last discussed the DSTv14 patches on this blog. The latest DSTv15 timezone definition file is cumulative and includes all DST changes released in earlier time zone definition files. DSTv15 includes changes to the following timezones since the DSTv14 release (with the affected years):
    Africa/Cairo 2010 2010
    Egypt 2010 2010
    America/Bahia_Banderas 2010 2010
    Asia/Amman 2002
    Asia/Gaza 2010 2010
    Europe/Helsinki 1981 1982
    Pacific/Fiji 2011
    Pacific/Apia 2011
    Hongkong 1977 1977
    Asia/Hong_Kong 1977 1977
    Europe/Mariehamn 1981 1982

    Read the article

  • What's new in the RightNow November 2012 release?

    - by Richard Lefebvre
    What's new in the RightNow November 2012 release? To find out, please watch this tutorial with an embedded demonstration, or read the November 2012 Release Notes.

    News Facts

    The November 2012 release of Oracle's RightNow CX Cloud Service marks the completion of development efforts for 2012 and continues Oracle's commitment to enhancing the Oracle RightNow offering following the acquisition. The new release delivers key capabilities designed to help organizations improve customer experiences in order to increase customer acquisition and retention, while reducing total cost of ownership.

    Part of the Oracle Cloud, Oracle RightNow CX Cloud Service now integrates Oracle RightNow Chat Cloud Service with Oracle Engagement Engine Cloud Service, helping organizations intelligently and proactively engage with customers through the right channel at the right time. Chat solutions have emerged as an important component of a cross-channel customer experience strategy. According to Forrester Research, Inc., chat adoption rose dramatically between 2009 and 2011, from 19% to 37%, and chat has the highest satisfaction level of all customer service channels at 62% satisfaction. (*)

    To help companies deliver enhanced customer experiences, Oracle has made significant investments in Oracle RightNow Chat Cloud Service throughout 2012. With the addition of rules-based engagement to existing capabilities such as co-browse, mobile chat, and cross-channel knowledge integration with the contact center, all delivered via the cloud, Oracle RightNow Chat Cloud Service is differentiated as the industry-leading chat solution. The Oracle Cloud offers a broad portfolio of software-as-a-service applications, including Oracle Customer Service and Support Cloud Service, which is based on the Oracle RightNow CX Cloud Service.

    New Capabilities

    Key Oracle RightNow Chat Cloud Service and other cross-channel capabilities include:

    Chat Business Rules, with over 70 built-in rule conditions, leverage the Oracle Engagement Engine to help organizations capture rich visitor data and invoke complex actions and triggers. Chat Business Rules allow granular control over when to engage a customer via the chat channel, based on customer behavior, customer profile information and operational information.

    Click-to-Call provides the option for a customer to engage with a live agent over the phone during the Web browsing experience.

    Chat Availability Controls provide organizations with the ability to throttle volume through the chat channel based on real-time agent availability and wait-time thresholds. This ability to manage the channel more efficiently allows organizations to provide a better experience to customers using the chat channel.

    Strategic and Operational Chat Channel Analytics provide better insight into channel and agent productivity, utilization and effectiveness, with both out-of-the-box and ad hoc reports. The new chat channel analytics provide comprehensive metrics with full data transparency.

    Background Service Updates improve high-availability metrics for Oracle RightNow Chat Cloud Service during service update periods, setting the industry-leading standard for sales and service delivery to customers via the chat channel.
    Additional capabilities include: improved Web developer tools for more efficient self-service user interface design; improved administration for enhanced user session management; increased cross-channel community collaboration; enhanced extensibility widgets and syndication management; and streamlined content management and analytics capabilities. Read the full announcement here.

    Read the article
