Search Results

Search found 16054 results on 643 pages for 'reference architecture'.


  • SQL SERVER – Preserve Leading Zero While Copying to Excel from SSMS

    - by pinaldave
    Earlier I wrote two articles about how to efficiently copy data from SSMS to Excel. Since I wrote those posts there has been plenty of interest in this subject, and there are a few questions I keep getting about it. One of the questions is how to preserve a leading zero while copying data from SSMS to Excel. Well, it is handled almost the same way as in my earlier post SQL SERVER – Excel Losing Decimal Values When Value Pasted from SSMS ResultSet. The key here is in Excel and not in SQL Server. The step is to change the format of the Excel cell from Number to Text, and that will preserve the value with its leading or trailing zeros in Excel. However, I assume this is done for display purposes only, because once you convert the column to Text you may find it difficult to perform numeric operations over it, for example aggregations or averages. If you need those, you should either convert the column back to Numeric in Excel or do the processing in the database and export the numeric value alongside the formatted one. That said, I have seen requirements in the real world where the user has to have a numeric value with leading zeros for display purposes. Here is my suggestion: instead of manipulating the numeric value in the database and converting it to a character value, the ideal thing to do is to store it as a numeric value only in the database. Whatever changes you want for display purposes should be handled at display time using the formatting functions of SQL or of the application language. Honestly, the database is the data layer and presentation is the presentation layer – they are two different things and, if possible, they should not be mixed. If for any reason you cannot follow the above advice and you need to append leading zeros in the database itself, here are two of my previous articles; I suggest you refer to them. I am open to learning new tricks, as these articles are almost three years old. Please share your opinions and suggestions in the comments area. SQL SERVER – Pad Right Side of Number with 0 – Fixed Width Number Display SQL SERVER – UDF – Pad Right Side of Number with 0 – Fixed Width Number Display Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Excel
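
    If the leading zeros really must be produced on the SQL Server side, the padding can be done at query time so the stored column stays numeric. A minimal sketch – the six-digit width and the table and column names are assumptions for illustration only:

        -- Classic padding trick: prepend zeros, then keep the rightmost N characters
        SELECT RIGHT(REPLICATE('0', 6) + CAST(AccountNumber AS VARCHAR(6)), 6) AS PaddedValue
        FROM dbo.Accounts;

        -- On SQL Server 2012 and later, FORMAT() expresses the same intent more readably
        SELECT FORMAT(AccountNumber, '000000') AS PaddedValue
        FROM dbo.Accounts;

    Either way the underlying column remains numeric; only the presented string carries the zeros.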

    Read the article

  • Bridging the Gap in Cloud, Big Data, and Real-time

    - by Dain C. Hansen
    With all the buzz around big data and cloud computing, it is easy to overlook one of your most precious commodities—your data. Today’s businesses cannot stand still when it comes to data. Market success now depends on speed, volume, complexity, and keeping pace with the latest data integration breakthroughs. Are you up to speed with big data, cloud integration, and real-time analytics? Join us in this three-part blog series where we’ll look at each component in more detail. Meet us online on October 24th, where we’ll take your questions about the issues you are facing in this brave new world of integration. Let’s start first with cloud. What happens with your data when you decide to implement a private cloud architecture? Or a public cloud? Data integration solutions play a vital role in migrating data simply, efficiently, and reliably to the cloud; they are a necessary ingredient of any platform-as-a-service strategy because they support cloud deployments with data-layer application integration between on-premise and cloud environments of all kinds. For private cloud architectures, consolidation of your databases and data stores is an important step toward receiving the full benefits of cloud computing. Private cloud integration requires bidirectional replication between heterogeneous systems so you can perform data consolidation without interrupting your business operations. In addition, bulk loading and transforming data into and out of your private cloud is a crucial step for companies moving to a private cloud. There is also a need to manage data services as part of SOA/BPM solutions that enable agile application delivery and help build shared data services for organizations. But what about public cloud? If you have moved your data to a public cloud application, you may also need to connect your on-premise enterprise systems and the cloud environment by moving data in bulk or as real-time transactions across geographies. For both public and private cloud architectures, Oracle offers a complete and extensible set of integration options that span not only data integration but also service and process integration, security, and management. For those companies investing in Oracle Cloud, you can move your data through Oracle SOA Suite using REST APIs to Oracle Messaging Cloud Service—a new service that lets applications deployed in Oracle Cloud securely and reliably communicate over Java Messaging Service. As an example of loading and transforming data into other public clouds, Oracle Data Integrator supports a knowledge module for Salesforce.com—now available on AppExchange. Other third-party knowledge modules are being developed by customers and partners every day. To learn more about how to leverage Oracle’s Data Integration products for the cloud, join us live: Data Integration Breakthroughs Webcast on October 24th, 10 AM PST.

    Read the article

  • Oracle Products Reflect Key Trends Shaping Enterprise 2.0

    - by kellsey.ruppel(at)oracle.com
    Following up on his predictions for 2011, we asked Enterprise 2.0 veteran Andy MacMillan to map out the ways Oracle solutions are at the forefront of industry trends--and how Oracle customers can benefit in the coming year. 1. Increase organizational awareness | Oracle WebCenter Suite Oracle WebCenter Suite provides a unique set of capabilities to drive organizational awareness. In particular, the expansive activity graph connects users directly to key enterprise applications, activities, and interests. In this way, applicable and critical business information is automatically and immediately visible--in the context of key tasks--via real-time dashboards and comprehensive reporting. Oracle WebCenter Suite also integrates key E2.0 services, such as blogs, wikis, and RSS feeds, into critical business processes, including back-office systems of records such as ERP and CRM systems. 2. Drive online customer engagement | Oracle Real-Time Decisions With more and more business being conducted on the Web, driving increased online customer engagement becomes a critical key to success. This effort is usually spearheaded by an increasingly important executive role, the Head of Online, who usually reports directly to the CMO. To help manage the Web experience online, Oracle solutions are driving a new kind of intelligent social commerce by combining Oracle Universal Content Management, Oracle WebCenter Services, and Oracle Real-Time Decisions with leading e-commerce and product recommendations. Oracle Real-Time Decisions provides multichannel recommendations for content, products, and services--including seamless integration across Web, mobile, and social channels. The result: happier customers, increased customer acquisition and retention, and improved critical success metrics such as shopping cart abandonment. 3. Easily build composite applications | Oracle Application Development Framework Thanks to the shared user experience strategy across Oracle Fusion Middleware, Oracle Fusion Applications and many other Oracle Applications, customers can easily create real, customer-specific composite applications using Oracle WebCenter Suite and Oracle Application Development Framework. Oracle Application Development Framework components provide modular user interface components that can build rich, social composite applications. In addition, a broad set of components spanning BPM, SOA, ECM, and beyond can be quickly and easily incorporated into composite applications. 4. Integrate records management into a global content platform | Oracle Enterprise Content Management 11g Oracle Enterprise Content Management 11g provides leading records management capabilities as part of a unified ECM platform for managing records, documents, Web content, digital assets, enterprise imaging, and application imaging. This unique strategy provides comprehensive records management in a consistent, cost-effective way, and enables organizations to consolidate ECM repositories and connect ECM to critical business applications. 5. Achieve ECM at extreme scale | Oracle WebLogic Server and Oracle Exadata To support the high-performance demands of a unified and rationalized content platform, Oracle has pioneered highly scalable and high-performing ECM infrastructures. Two innovations in particular helped make this happen. The core ECM platform itself moved to an Enterprise Java architecture, so organizations can now use Oracle WebLogic Server for enhanced scalability and manageability. 
Oracle Enterprise Content Management 11g can leverage Oracle Exadata for extreme performance and scale. Likewise, Oracle Exalogic--Oracle's foundation for cloud computing--enables extreme performance for processor-intensive capabilities such as content conversion or dynamic Web page delivery. Learn more about Oracle's Enterprise 2.0 solutions.

    Read the article

  • SQL SERVER – CSVExpress and Quick Data Load

    - by pinaldave
    One of the newest ETL tools is CSVexpress.com. This is a program that can quickly load any CSV file into ODBC-compliant databases using data integration. For those of you familiar with databases and how they operate, the question that comes to mind might be what use this program will have in your life. I have written an earlier article on this subject here: SQL SERVER – Import CSV into Database – Transferring File Content into a Database Table using CSVexpress. You might know that RDBMSs have built-in support for loading CSV files into tables – but it is not quite as easy as one click of a button. First of all, most databases have a command line interface and you need the file and a configuration script in order to load it up. You also need to know enough to write the script – which for novices can be extremely daunting. On top of all this, if you work with more than one type of RDBMS, you need to know the ins and outs of uploading and writing scripts for more than one program. So you might begin to see how useful CSVexpress.com might be! There are many other tools that enable uploading files to a database. They can be very fancy – some can generate configuration files automatically, others load the data directly. Again, novices will be able to tell you why these aren’t the most useful programs in the world. You see, these programs were created with SQL in mind, not for uploading data. If you don’t have large amounts of data to upload, getting the configurations right can be a long process and you will have to check the generated code yourself. Not exactly “easy to use” for novices. That makes CSVexpress.com one of the best new tools available for everyone – but especially people who don’t want to learn a lot of new material all at once. CSVexpress has an easy-to-navigate graphical user interface and no scripting or coding is required. There are built-in constraints and data validations, and you can configure transforms and reject records right there on the screen. But the best thing of all – it’s free! That’s right, you can download CSVexpress for free from www.csvexpress.com and start easily uploading and configuring files almost immediately. If you’re currently happy with your method of data configuration, keep up the good work. For the rest of us, there’s CSVexpress.com. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology
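
    For comparison, this is roughly what the “manual” route looks like in T-SQL when loading a CSV yourself – a minimal BULK INSERT sketch, where the table name, file path, and CSV layout are assumptions for illustration:

        -- The target table must already exist and match the CSV column layout
        BULK INSERT dbo.SalesImport
        FROM 'C:\imports\sales.csv'
        WITH (
            FIRSTROW = 2,            -- skip the header row
            FIELDTERMINATOR = ',',   -- column delimiter
            ROWTERMINATOR = '\n'     -- row delimiter
        );

    Getting the delimiters, data types, and error handling right for every file is exactly the scripting overhead that a point-and-click loader removes.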

    Read the article

  • The enterprise vendor con - connecting SSDs using SATA 2 (3Gbit/s) thus limiting their performance

    - by tonyrogerson
    When comparing SSD against hard drive performance it really makes me cross when folk think that comparing an array of SSDs running at 3Gbit/s against hard drives running at 6Gbit/s is somehow valid. In a paper from DELL (http://www.dell.com/downloads/global/products/pvaul/en/PowerEdge-PowerVaultH800-CacheCade-final.pdf) on increasing database performance using the DELL PERC H800 with Solid State Drives, they compare four SSDs connected at 3Gbit/s against ten 10Krpm drives connected at 6Gbit/s [Tony slaps forehead while shouting DOH!].
    It is true that in the case of hard drives it probably doesn’t make much difference whether it is 3Gbit or 6Gbit, because SAS and SATA are both end-to-end protocols rather than a shared bus architecture like SCSI, so the hard drive doesn’t share bandwidth and probably can’t get near the 600MiBytes/second throughput that 6Gbit gives unless you are doing contiguous reads. In my own tests on a single 15Krpm SAS disk using IOMeter (8 worker threads, queue depth of 16 with a stripe size of 64KiB, an 8KiB transfer size on a drive formatted with an allocation size of 8KiB, for a 100% sequential read test) I only get 347MiBytes per second sustained throughput at an average latency of 2.87ms per IO, equating to 44.5K IOps. OK, if that were 3Gbit it would be less – around 280MiBytes per second – oh, but wait a minute [...fingers tap desk].
    You’ll struggle to find in the commodity space an SSD that doesn’t have the SATA 3 (6Gbit) interface. SSDs are fast: not only low latency and high IOps, they also offer a very large sustained transfer rate. Consider the OCZ Agility 3. It so happens that in my master’s dissertation I did the same test but on a different box; I got 374MiBytes per second at an average latency of 2.67ms per IO, equating to 47.9K IOps. The cost of a 240GB Agility 3 is £174.24 (http://www.scan.co.uk/products/240gb-ocz-agility-3-ssd-25-sata-6gb-s-sandforce-2281-read-525mb-s-write-500mb-s-85k-iops), but that same drive sat in a box connected with SATA 2 (3Gbit) would only yield around 280MiBytes per second, thus losing almost 100MiBytes per second of throughput and a ton of IOps too.
    So why the hell are “enterprise” vendors still only connecting SSDs at 3Gbit? Well, my conspiracy theory states that they have no interest in you moving to SSD because they’ll lose so much money; the argument that they use SATA 2 doesn’t wash – SATA 3 has been out for some time now and all the commodity stuff you buy uses it. Consider the cost, not in terms of price per GB but price per IOps: SSDs absolutely thrash hard drives on that. It used to be true that the opposite also held – hard drives thrashed SSDs on price per GB – but is that still the case? I’m not so sure. A 300GByte 2.5” 15Krpm SAS drive costs £329.76 ex VAT (http://www.scan.co.uk/products/300gb-seagate-st9300653ss-savvio-15k3-25-hdd-sas-6gb-s-15000rpm-64mb-cache-27ms), which equates to £1.09 per GB, compared to a 480GB OCZ Agility 3 costing £422.10 ex VAT (http://www.scan.co.uk/products/480gb-ocz-agility-3-ssd-25-sata-6gb-s-sandforce-2281-read-525mb-s-write-410mb-s-30k-iops), which equates to £0.88 per GB. OK, I compared an “enterprise” hard drive with a “commodity” SSD, and things get a little more complicated here: most “enterprise” SSDs are SLC and most commodity ones are MLC; SLC gives more performance and wear, and I’ll talk about that another day. For now though, don’t get sucked in by vendor marketing: SATA 2 (3Gbit) just doesn’t cut it; SSDs need 6Gbit to breathe, and even that they are pushing. Alas, SSDs are connected using SATA, so all the controllers I’ve seen thus far from HP and DELL only do SATA 2 – deliberate? Well, I’ll let you decide on that one.
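
    The throughput-to-IOps figures quoted above are easy to sanity-check: sustained MiB/s divided by the transfer size gives the IO rate. A quick back-of-the-envelope check, written as T-SQL purely for convenience:

        -- 347 MiB/s at an 8 KiB transfer size ~= 44.4K IOps; 374 MiB/s ~= 47.9K IOps
        SELECT 347.0 * 1024 / 8 AS SAS15KrpmIOps,   -- 44416
               374.0 * 1024 / 8 AS Agility3IOps;    -- 47872

    Both results line up with the 44.5K and 47.9K IOps reported in the tests above.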

    Read the article

  • SQL SERVER – How to Force New Cardinality Estimation or Old Cardinality Estimation

    - by Pinal Dave
    After reading my initial two blog posts on the new cardinality estimation, I received quite a few questions. Once I received these questions, I felt I should have clarified a few things earlier, when I started to write about cardinality. Before continuing with this blog post, if you have not read them already, I suggest you read the following two posts: SQL SERVER – Simple Demo of New Cardinality Estimation Features of SQL Server 2014 and SQL SERVER – Cardinality Estimation and Performance – SQL in Sixty Seconds #072. Q: Will the new cardinality estimation improve the performance of all of my queries? A: Remember, there is no 0-or-1 logic when it comes to estimation. The general assumption is that most queries will benefit from the new cardinality estimation introduced in SQL Server 2014. That is why the generic advice is to set the compatibility level of the database to 120, which is for SQL Server 2014. Q: Is it possible that after switching to the new cardinality estimation logic, by setting the compatibility level to 120, I get degraded performance for a few queries? A: Yes, it is possible. However, the number of queries impacted this way should be very small. Q: Can I still run my database at the older compatibility level and force a few queries to use the newer cardinality estimation logic? If yes, how? A: Yes, you can do that. You will need to force your query with trace flag 2312 to use the newer cardinality estimation logic. USE AdventureWorks2014 GO -- Old Cardinality Estimation ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 110 GO -- Using New Cardinality Estimation SELECT [AddressID],[AddressLine1],[City] FROM [Person].[Address] OPTION(QUERYTRACEON 2312); -- Using Old Cardinality Estimation SELECT [AddressID],[AddressLine1],[City] FROM [Person].[Address]; GO Q: Can I run my database at the newer compatibility level and force a few queries to use the older cardinality estimation logic? If yes, how? A: Yes, you can do that. You will need to force your query with trace flag 9481 to use the older cardinality estimation logic. USE AdventureWorks2014 GO -- NEW Cardinality Estimation ALTER DATABASE AdventureWorks2014 SET COMPATIBILITY_LEVEL = 120 GO -- Using New Cardinality Estimation SELECT [AddressID],[AddressLine1],[City] FROM [Person].[Address]; -- Using Old Cardinality Estimation SELECT [AddressID],[AddressLine1],[City] FROM [Person].[Address] OPTION(QUERYTRACEON 9481); GO I guess I have covered most of the questions I have received so far. If I have missed any, please send them to me again and I will include them. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
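
    Before flipping the compatibility level back and forth, it can help to confirm what a database is currently set to. A small sketch – the database name is simply the sample used above:

        -- 110 = SQL Server 2012 (older cardinality estimator), 120 = SQL Server 2014 (new estimator)
        SELECT name, compatibility_level
        FROM sys.databases
        WHERE name = 'AdventureWorks2014';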

    Read the article

  • SQLAuthority News – Presented Soft Skill Session on Presentation Skills at SQL Bangalore on May 3, 2014

    - by Pinal Dave
    I have presented on various database technologies for almost 10 years now. SQL, databases, and NoSQL have been part of my life. Earlier this month, I had the opportunity to present on the topic Performing an Effective Presentation. I must say it was a blast to prepare as well as present this session. This event was part of the SQL Bangalore community. If you are in Bangalore, you must be part of this group. SQL Bangalore is a wonderful community and we always have a great response when we present on technology. It is a SQL user group and we discuss everything SQL there. This month we had a SQL Server 2014 theme and a community launch of SQL Server. We had the best of the best speakers presenting on SQL Server 2014 technology, and each of them did justice to the subject. You can read about this over here. In this session I told a story from my life. I talked about who inspired me and how I learned to speak in public, and told stories about two legends who have inspired me. There is no video recording of this session. If you want the resources from this session, please sign up for my newsletter at http://bit.ly/sqllearn. Well, I had a great time at this event. Over 250 people showed up and we had a grand time together. I personally enjoyed the sessions by Amit Benerjee, Balmukund Lakhani and Vinod Kumar. Ken and Surabh also entertained the audience. Overall, this was a grand event, and if you were in Bangalore and did not make it, you did miss out on a few things. Here are a few photos of this event: the SQL Bangalore UG group (Nupur, Chandra, Shaivi, Balmukund, Amit, Vinod), the SQL Bangalore UG audience, and Pinal Dave presenting at the SQL UG in Bangalore. Here are a few of the slides from this presentation: Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL

    Read the article

  • Tablet design guide, Endeca patterns now available

    - by JuergenKress
    UX Direct, an Oracle program that offers consultants, partners, and customers the same scientifically proven and reusable user experience best practices that Oracle uses to build Oracle Applications, recently added links to a new design guide for creating tablet-based solutions for enterprise applications, and to the recently published Endeca User Interface Design Pattern Library. The tablet design guide is available from the UX Direct Home page. Tap the button under “Latest patterns & tools” for “Oracle Applications UX Tablet Guide.” It provides basic help for designers, developers, and project managers trying to approach tablet design and testing from an enterprise point of view. To hear what developers are saying about it, follow the links from this post on the User Experience Assistance blog. The newly released Endeca User Interface Design Pattern Library is also available from the UX Direct Home page and from a post on the User Experience Assistance blog. It describes principled ways to solve common user interface (UI) design problems related to search, faceted navigation, and discovery. The link between Simplified UI and Oracle UX strategy, plus content you can share on the cloud, ADf, tailoring, and more Simplified User Interface in Oracle Fusion Applications Fronts Oracle Cloud Offerings This new article on Simplified UI has just been posted on Usable Apps. Learn about the three themes - simplicity, mobility, and extensibility – that Simplified UI embodies. These same principles are guiding the development of the next generation of the Oracle user experience. Oracle's Applications User Experience Strategy: One Cloud User Experience, with Optimized UIs Where and How You Want This podcast from Misha Vaughan, Director, User Experience, is now available on the Oracle University Knowledge Center. It is available for partners and Oracle employees at this iLearning Link. Oracle Partner Builds User Experience That Hits Right Note for New Employees This new article on the Usable Apps website explores the experience of consultants at IntraSee as they implement a PeopleSoft onboarding process for Invesco, a global asset management company. The Feng Shui of Fusion This article in Oracle Scene is from Grant Ronald, Director of Product Management, on the Tools of Fusion: Oracle JDeveloper and Oracle ADF. Hands-On Workshop with Fusion Applications and ADF UX Desktop Design Patterns This post on the Voice of User Experience, or VoX, blog from Misha Vaughan describes a new kind of workshop for partners and a handful of internal Oracle sales folks on extending Oracle Fusion Applications and building custom applications with Application Development Framework (ADF) while maintaining the Oracle user experience. To learn more about the content that was delivered during this three-day workshop, visit the Usable Apps blog. Recent posts from a new blog series take a look at several of the topics discussed during the workshop. Applications User Experience Fundamentals Visual Design for any Enterprise User Interface / Art School in a Box Wireframing / Blueprinting Usable Applications Concepts. Tailoring videos This blog post from Richard Bingham, Applications Architect, on the Fusion Applications Developer Relations blog provides links to several videos that show many customization and development tasks using the Oracle Fusion Applications platform. 
SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Mix Forum Technorati Tags: UX,Architecture,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • SQL SERVER – Quiz and Video – Introduction to SQL Error Actions

    - by pinaldave
    This blog post is inspired by SQL Programming Joes 2 Pros: Programming and Development for Microsoft SQL Server 2008 – SQL Exam Prep Series 70-433 – Volume 4. [Amazon] | [Flipkart] | [Kindle] | [IndiaPlaza] This is a follow-up to my earlier blog post on the same subject – SQL SERVER – Introduction to SQL Error Actions – A Primer. In that article we discussed the basic terminology of error handling. This post further covers the following important concepts of error handling: an introduction to SQL error actions, Statement Termination, Scope Abortion, and Batch Termination. These three termination types are the most important concepts related to error handling in SQL Server. There are many more things one has to learn, but without the beginner’s fundamentals one cannot learn the advanced concepts. Let us have a small quiz and check how many of you get the fundamentals right. Quiz 1.) Which SQL Server error action happens for errors with a severity of 11-16 when you set the XACT_ABORT setting to ON? You will get Statement Termination. You will get Scope Abortion. You will get Batch Abortion. You will get Connection Termination. SQL Server will pick the error action. 2.) Which SQL Server error action happens for errors with a severity of 11-16 when you set the XACT_ABORT setting to OFF? You will get Statement Termination. You will get Scope Abortion. You will get Batch Abortion. You will get Connection Termination. SQL Server will pick the error action. Now make sure that you write down all the answers on a piece of paper. Watch the following video and read the earlier article over here. If you want to change your answers, you still have a chance. Solution: 1) 3, 2) 5. Now let us check the answers and compare yours to the ones above. I am very confident you will get them correct. Available at USA: Amazon India: Flipkart | IndiaPlaza Volume: 1, 2, 3, 4, 5 Please leave your feedback on the quiz and video in the comments area. Did you know all the answers to the quiz? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
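
    To make the quiz concrete, here is a minimal sketch – not from the book – showing how the XACT_ABORT setting changes the error action for a typical severity-16 run-time error such as divide by zero:

        -- With XACT_ABORT OFF (the default) the divide-by-zero error terminates only the statement;
        -- the batch keeps running and the PRINT below still executes.
        SET XACT_ABORT OFF;
        SELECT 1/0 AS BadMath;
        PRINT 'Reached after the error - statement termination';
        GO

        -- With XACT_ABORT ON the same error aborts the whole batch and rolls back any open
        -- transaction, so the PRINT below never executes.
        SET XACT_ABORT ON;
        SELECT 1/0 AS BadMath;
        PRINT 'Never reached - batch abortion';
        GO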

    Read the article

  • SQL SERVER – Simple Explanation and Puzzle with SOUNDEX Function and DIFFERENCE Function

    - by pinaldave
    Earlier this week I asked a question about how to swap the values of a column without using a CASE statement. Read it here: A Puzzle – Swap Value of Column Without Case Statement. There were more than 50 solutions proposed in the comments, many of them creative. I have mentioned my personal favorites (different ones) here: Solution of Puzzle – Swap Value of Column Without Case Statement. However, I received lots of questions regarding one of the solutions, by SIJIN KUMAR V P, who used the function SOUNDEX in his solution. The request was to explain how SOUNDEX and DIFFERENCE work. Well, there is pretty decent documentation for the SOUNDEX function and DIFFERENCE over on MSDN, and if I attempted to explain these functions I would end up writing the same details that are already available there. Instead of writing theory, we will try to learn these functions by using a couple of simple puzzles. Try to solve the puzzles using MSDN and see if you can learn something very quickly. In simple words – SOUNDEX converts an alphanumeric string to a four-character code to find similar-sounding words or names. The first character of the code is the first character of character_expression, and the second through fourth characters of the code are numbers that represent the letters in the expression. Vowels in character_expression are ignored unless they are the first letter of the string. The DIFFERENCE function returns an integer value: the number of characters in the SOUNDEX values that are the same. The return value ranges from 0 through 4, where 0 indicates weak or no similarity, and 4 indicates strong similarity or identical values. Learning Puzzle 1: Now let us run the following four queries and observe their output. SELECT SOUNDEX('SQLAuthority') SdxValue SELECT SOUNDEX('SLTR') SdxValue SELECT SOUNDEX('SaLaTaRa') SdxValue SELECT SOUNDEX('SaLaTaRaM') SdxValue When you look at the result set, all four values are the same. The reason is that, to the SQL Server SOUNDEX function, all four strings are similar-sounding strings. Learning Puzzle 2: Now let us run the following five queries and observe their output. SELECT DIFFERENCE (SOUNDEX('SLTR'),SOUNDEX('SQLAuthority')) SELECT DIFFERENCE (SOUNDEX('TH'),SOUNDEX('SQLAuthority')) SELECT DIFFERENCE ('SQLAuthority',SOUNDEX('SQLAuthority')) SELECT DIFFERENCE ('SLTR',SOUNDEX('SQLAuthority')) SELECT DIFFERENCE ('SLTR','SQLAuthority') When you look at the result set you will get results ranging from 0 through 4. Here is how to read them: a result of 0 means the values are not relevant to each other at all, and the closer the result gets to 4, the more closely related the values are. Have you ever used the above two functions for a business need or on a production server? If yes, would you please leave a comment with your use cases? I believe it will be beneficial to everyone. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
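
    As a practical illustration of the kind of use case asked about above, here is a hedged sketch of a fuzzy name search; the Person.Person table and the name being matched are simply assumed examples:

        -- Find people whose first name sounds similar to 'Pinal'
        -- (a DIFFERENCE of 3 or 4 indicates a reasonably strong SOUNDEX match)
        SELECT FirstName, LastName
        FROM Person.Person
        WHERE DIFFERENCE(FirstName, 'Pinal') >= 3
        ORDER BY FirstName;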

    Read the article

  • Lenovo X220 right click does not work with ubuntu 12.04

    - by fulop
    I am unable to right click with my new X220 Lenovo sub-notebook. I have read several workaround but even not know which one would help me. Can someone help me to find the solution or workaround? dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor): -D_FORTIFY_SOURCE=2 dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions -Wl,-z,relro dpkg-buildpackage: source package xserver-xorg-input-synaptics dpkg-buildpackage: source version 1.6.2-1ubuntu1~precise2 dpkg-buildpackage: source changed by Timo Aaltonen <[email protected]> dpkg-buildpackage: host architecture amd64 dpkg-source --before-build xserver-xorg-input-synaptics-1.6.2 fakeroot debian/rules clean dh clean --with quilt,autoreconf,xsf --builddirectory=build/ dh_testdir -O--builddirectory=build/ dh_auto_clean -O--builddirectory=build/ dh_quilt_unpatch -O--builddirectory=build/ Removing patch 131_reset-num_active_touches-on-deviceoff.patch Restoring src/synaptics.c Removing patch 130_dont_enable_rightbutton_area.patch Restoring conf/50-synaptics.conf Removing patch 129_disable_three_touch_tap.patch Restoring src/synaptics.c Removing patch 128_disable_three_click_action.patch Restoring src/synaptics.c Removing patch 126_ubuntu_xi22.patch Restoring configure.ac Removing patch 125_option_rec_revert.patch Restoring test/fake-symbols.h Restoring test/fake-symbols.c Removing patch 124_syndaemon_events.patch Restoring tools/syndaemon.c Removing patch 118_quell_error_msg.patch Restoring tools/synclient.c Restoring tools/syndaemon.c Removing patch 115_evdev_only.patch Restoring conf/50-synaptics.conf Removing patch 106_always_enable_vert_edge_scroll.patch Restoring src/synaptics.c Removing patch 104_always_enable_tapping.patch Restoring src/synaptics.c Removing patch 103_enable_cornertapping.patch Restoring src/synaptics.c Removing patch 101_resolution_detect_option.patch Restoring include/synaptics-properties.h Restoring man/synaptics.man Restoring src/synapticsstr.h Restoring src/properties.c Restoring src/synaptics.c Restoring tools/synclient.c Removing patch 02-do-not-use-synaptics-for-keyboards.patch Restoring conf/11-x11-synaptics.fdi No patches applied dh_autoreconf_clean -O--builddirectory=build/ dh_clean -O--builddirectory=build/ dpkg-source -b xserver-xorg-input-synaptics-1.6.2 dpkg-source: warning: no source format specified in debian/source/format, see dpkg-source(1) dpkg-source: info: using source format `1.0' dpkg-source: info: building xserver-xorg-input-synaptics using existing xserver-xorg-input-synaptics_1.6.2.orig.tar.gz dpkg-source: info: building xserver-xorg-input-synaptics in xserver-xorg-input-synaptics_1.6.2-1ubuntu1~precise2.diff.gz dpkg-source: warning: the diff modifies the following upstream files: autogen.sh docs/README.alps docs/tapndrag.dia docs/trouble-shooting.txt dpkg-source: info: use the '3.0 (quilt)' format to have separate and documented changes to upstream files, see dpkg-source(1) dpkg-source: info: building xserver-xorg-input-synaptics in xserver-xorg-input-synaptics_1.6.2-1ubuntu1~precise2.dsc debian/rules build dh build --with quilt,autoreconf,xsf 
--builddirectory=build/ dh_testdir -O--builddirectory=build/ dh_quilt_patch -O--builddirectory=build/ Applying patch 02-do-not-use-synaptics-for-keyboards.patch patching file conf/11-x11-synaptics.fdi Hunk #1 succeeded at 9 (offset 7 lines). Applying patch 101_resolution_detect_option.patch patching file include/synaptics-properties.h patching file man/synaptics.man patching file src/properties.c Hunk #3 succeeded at 787 (offset 6 lines). patching file src/synaptics.c Hunk #2 succeeded at 1403 (offset 3 lines). Hunk #3 succeeded at 1421 (offset 3 lines). patching file src/synapticsstr.h patching file tools/synclient.c Applying patch 103_enable_cornertapping.patch patching file src/synaptics.c Hunk #1 succeeded at 762 with fuzz 1 (offset 202 lines). Applying patch 104_always_enable_tapping.patch patching file src/synaptics.c Hunk #1 succeeded at 662 with fuzz 2 (offset 6 lines). Applying patch 106_always_enable_vert_edge_scroll.patch patching file src/synaptics.c Hunk #1 succeeded at 673 (offset 174 lines). Applying patch 115_evdev_only.patch patching file conf/50-synaptics.conf Hunk #1 succeeded at 14 with fuzz 2. Applying patch 118_quell_error_msg.patch patching file tools/synclient.c patching file tools/syndaemon.c Applying patch 124_syndaemon_events.patch patching file tools/syndaemon.c Applying patch 125_option_rec_revert.patch patching file test/fake-symbols.c patching file test/fake-symbols.h Applying patch 126_ubuntu_xi22.patch patching file configure.ac Applying patch 128_disable_three_click_action.patch patching file src/synaptics.c Hunk #1 succeeded at 671 (offset 174 lines). Applying patch 129_disable_three_touch_tap.patch patching file src/synaptics.c Hunk #1 succeeded at 665 (offset 32 lines). Applying patch 130_dont_enable_rightbutton_area.patch patching file conf/50-synaptics.conf Applying patch 131_reset-num_active_touches-on-deviceoff.patch patching file src/synaptics.c Applying patch 201-wait.patch patching file src/eventcomm.c Hunk #1 FAILED at 750. Hunk #2 FAILED at 775. Hunk #3 FAILED at 784. 3 out of 3 hunks FAILED -- rejects in file src/eventcomm.c Patch 201-wait.patch does not apply (enforce with -f) dh_quilt_patch: quilt --quiltrc /dev/null push -a || test $? = 2 returned exit code 1 make: *** [build] Error 25 dpkg-buildpackage: error: debian/rules build gave error exit status 2

    Read the article

  • Enhance Your Browsing with Hyperwords in Firefox

    - by Asian Angel
    While browsing it is easy to find information that you would like to know more about, convert, or translate. The Hyperwords extension provides access to these types of resources and more in Firefox. Using Hyperwords Once the extension has finished installing you will be presented with a demo video that will let you learn more about how Hyperwords works. For our first example we chose to look for more information concerning “WASP-12b” using Wikipedia. Notice the small bluish circle on the lower right of the highlighted term…it is the default access point for the Hyperwords menu (access it by hovering your mouse over it). If you hover over the Wikipedia (or other) link you can access the information in a scrollable popup window. Or if you prefer, click on the link to view the information in a new tab. Choose the style that best suits your needs. Hyperwords is extremely useful for quick unit conversions. Suppose you want to share a news story that you have found while browsing. Highlight the title, access Hyperwords, and choose your preferred sharing source. You may need to authorize access for Hyperwords to post to your account. Once you have authorized access you can start sharing those links very easily. This is just a small sampling of Hyperwords’ many useful features. Preferences Hyperwords has a nice set of preferences available to help you customize it. Alter the menu popup style, add or remove menu entries, and modify other functions for Hyperwords. Conclusion Hyperwords makes a nice addition to Firefox for anyone needing quick access to search, reference, translation, and other services while browsing. Links Download the Hyperwords extension (Mozilla Add-ons) Download the Hyperwords extension (Extension Homepage)

    Read the article

  • Delegates in C#

    - by Jalpesh P. Vadgama
    I have used delegates in my programming since C# 2.0, but I have seen that there is a lot of confusion around them, so I have decided to blog about it. In this post I will explain delegate basics and the use of delegates in C#. What is a delegate? We can say a delegate is a type-safe function pointer which holds a reference to a method. As per MSDN, it is a type that references a method. You can assign one or more methods with the same parameters and return type to a delegate. Following is the syntax for a delegate: public delegate int Calculate(int a, int b); Here you can see we have defined the delegate with two int parameters and an int return type. Now any method that matches these parameters can be assigned to the above delegate. To understand the functionality of delegates let’s take the following simple example. using System; namespace Delegates { class Program { public delegate int CalculateNumber(int a, int b); static void Main(string[] args) { int a = 5; int b = 5; CalculateNumber addNumber = new CalculateNumber(AddNumber); Console.WriteLine(addNumber(5, 6)); Console.ReadLine(); } public static int AddNumber(int a, int b) { return a + b; } } } Here in the above code you can see that I have created an object of the CalculateNumber delegate and assigned the AddNumber static method to it. The AddNumber static method just returns the sum of two numbers. After that I call the method through the delegate and print the output to the console application. Now let’s run the application; following is the output, as expected. That’s it. You can see the output of the delegate after adding the numbers. Delegates can be used in a variety of scenarios. For example, in a web application we can use them to update one control’s properties from another control’s action, or invoke a delegate when some UI interaction happens, like a button click. Hope you liked it. Stay tuned for more. In the next post I am going to explain multicast delegates. Till then, happy programming.

    Read the article

  • SQLAuthority News – Updates on Contests, Books and SQL Server

    - by pinaldave
    There are lots of things happening on this blog and I feel it is sometimes difficult to keep up. One of the suggestions I keep receiving is to have a single page one can visit to see all the updates. I did consider it at some point, but in the era of RSS feeds it is difficult to attract a proper audience to such a page. Here are a few updates on the various contests and book giveaways from recent weeks. Combo set of 5 Joes 2 Pros Books – 1 for YOU and 1 for a Friend – I have received so many entries for this contest. Many have sent me email asking if the contest can be extended by a couple of days, so the deadline is now Nov 10th, 7 AM. You can send your entries by that time. The prize is two combo sets of Joes 2 Pros worth USD 444. If you have not taken part in the contest, please take part now. Guess What is in the Box? – There were many entries for this contest. We played this contest on the blog as well as on Facebook. The answer was announced two days later in the blog post announcing my new book. The winner was Manas Dash from Bangalore. He answered “The box will contain SQL book authored by Vinod and Pinal”, which was the closest answer we received. The Win 5 SQL Programming Books contest will have its winners announced by Nov 15th, and the winners will be notified by email. The Win 5 SQL Wait Stats Books contest is closed and the winners have been sent their awards. My third book, SQL Server Interview Questions and Answers, ran out of stock in India within 36 hours of its launch. We are working very hard to make it available again. Thank you again for the excellent support! Without your participation, none of these giveaways would have any significance. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Pinal Dave, PostADay, Readers Contribution, Readers Question, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Shard No More – An Innovative Look at Distributed Peer-to-peer SQL Database

    - by pinaldave
    There is no doubt that SQL databases play an important role in modern applications. In an ideal world, a single database can handle hundreds of incoming connections from multiple clients and scale to accommodate the related transactions. However the world is not ideal and databases are often a cause of major headaches when applications need to scale to accommodate more connections, transactions, or both. In order to overcome scaling issues, application developers often resort to administrative acrobatics, also known as database sharding. Sharding helps to improve application performance and throughput by splitting the database into two or more shards. Unfortunately, this practice also requires application developers to code transactional consistency into their applications. Getting transactional consistency across multiple SQL database shards can prove to be very difficult. Sharding requires developers to think about things like rollbacks, constraints, and referential integrity across tables within their applications when these types of concerns are best handled by the database. It also makes other common operations such as joins, searches, and memory management very difficult. In short, the very solution implemented to overcome throughput issues becomes a bottleneck in and of itself. What if database sharding was no longer required to scale your application? Let me explain. For the past several months I have been following and writing about NuoDB, a hot new SQL database technology out of Cambridge, MA. NuoDB is officially out of beta and they have recently released their first release candidate so I decided to dig into the database in a little more detail. Their architecture is very interesting and exciting because it completely eliminates the need to shard a database to achieve higher throughput. Each NuoDB database consists of at least three or more processes that enable a single database to run across multiple hosts. These processes include a Broker, a Transaction Engine and a Storage Manager.  Brokers are responsible for connecting client applications to Transaction Engines and maintain a global view of the network to keep track of the multiple Transaction Engines available at any time. Transaction Engines are in-memory processes that client applications connect to for processing SQL transactions. Storage Managers are responsible for persisting data to disk and serving up records to the Transaction Managers if they don’t exist in memory. The secret to NuoDB’s approach to solving the sharding problem is that it is a truly distributed, peer-to-peer, SQL database. Each of its processes can be deployed across multiple hosts. When client applications need to connect to a Transaction Engine, the Broker will automatically route the request to the most available process. Since multiple Transaction Engines and Storage Managers running across multiple host machines represent a single logical database, you never have to resort to sharding to get the throughput your application requires. NuoDB is a new pioneer in the SQL database world. They are making database scalability simple by eliminating the need for acrobatics such as sharding, and they are also making general administration of the database simpler as well.  Their distributed database appears to you as a user like a single SQL Server database.  With their RC1 release they have also provided a web based administrative console that they call NuoConsole. 
This tool makes it extremely easy to deploy and manage NuoDB processes across one or multiple hosts with the click of a mouse button. See for yourself by downloading NuoDB here. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: CodeProject, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology Tagged: NuoDB

    Read the article

  • SQL SERVER – A Puzzle Part 3 – Fun with SEQUENCE in SQL Server 2012 – Guess the Next Value

    - by pinaldave
    Before continuing with this blog post, please read the two earlier parts of the SEQUENCE puzzle here: A Puzzle – Fun with SEQUENCE in SQL Server 2012 – Guess the Next Value and A Puzzle Part 2 – Fun with SEQUENCE in SQL Server 2012 – Guess the Next Value, where we played a simple guessing game about predicting the next value. The answers to those puzzles are shared as comments on the blog posts. Now here is the next puzzle, based on yesterday’s puzzle. I recently shared the puzzle from that blog post with my local user group and the attendees appreciated it. First, execute the script which I have written here. Today’s script is a bit different from yesterday’s, as it will require you to perform some service-related activities. I suggest you try this in a test environment on your personal computer when no one else is working on it. Do not attempt this on a production server, as it will for sure get you in trouble. The purpose is to learn how a sequence behaves during unexpected shutdowns and service restarts. Now guess what the next value will be, as requested in the query. USE AdventureWorks2012 GO -- Create sequence CREATE SEQUENCE dbo.SequenceID AS BIGINT START WITH 1 INCREMENT BY 1 MINVALUE 1 MAXVALUE 500 CYCLE CACHE 100; GO -- Following will return 1 SELECT next value FOR dbo.SequenceID; ------------------------------------- -- simulate server crash by restarting service -- do not attempt this on production or any server in use ------------------------------------ -- Following will return ??? SELECT next value FOR dbo.SequenceID; -- Clean up DROP SEQUENCE dbo.SequenceID; GO Once the server is restarted, what will be the next value for SequenceID? We can learn interesting trivia about this new feature of SQL Server using this puzzle. Hint: Pay special attention to the difference between the new number and the earlier number. Can you see the same number in the definition of the CREATE SEQUENCE? Bonus Question: How can you avoid the behavior demonstrated in the above query? Does it have any effect on performance? I suggest you try to answer this question without running the code in SQL Server 2012. You can restart SQL Server using the command prompt as well. I will follow up with the answer in the comments below. Recently my friend Vinod Kumar wrote an excellent blog post on SQL Server 2012: Using SEQUENCE; you can head over there to learn about sequences in detail. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
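
    Without giving away the answer to the bonus question entirely, the behavior in the puzzle is tied to the CACHE setting, and a sequence can be declared without a cache if you are willing to trade some performance for it. A minimal sketch, assuming the same sample sequence:

        -- NO CACHE persists the current value on every increment, so an unexpected
        -- shutdown does not skip over a cached range, at the cost of extra disk I/O.
        CREATE SEQUENCE dbo.SequenceID_NoCache
            AS BIGINT
            START WITH 1
            INCREMENT BY 1
            MINVALUE 1
            MAXVALUE 500
            CYCLE
            NO CACHE;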

    Read the article

  • SQL – Step by Step Guide to Download and Install NuoDB – Getting Started with NuoDB

    - by Pinal Dave
    Let us take a look at the application you own at your business. If you pay attention to the underlying database for that application you will be amazed. Every successful business these days processes way more data than they used to process before. The number of transactions and the amount of data is growing at an exponential rate. Every single day there is way more data to process than before. Big data is no longer a concept; it is now turning into reality. If you look around there are so many different big data solutions and it can be a quite difficult task to figure out where to begin. Personally, I have been experimenting with a lot of different solutions which allow my database to scale immediately without much hassle while maintaining optimal database performance.  There are for sure some solutions out there, but for many I even have to learn their specific language and there is a lot of new exploration to do. Honestly, what I prefer is a product, which works with the language I know (SQL) and follows all the RDBMS concepts which I am familiar with (ACID etc.). NuoDB is one such solution.  It is an operational NewSQL database built on a patented emergent architecture with full support for SQL and ACID guarantees. In this blog post, I will explore how one can download and install NuoDB database. Step 1: Follow me and go to the NuoDB download page. Simply fill out the form, accept the online license agreement, and you will be taken directly to a page where you can select any platform you prefer to install NuoDB. In my example below, I select the Windows 64-bit platform as it is one of the most popular NuoDB platforms. (You can also run NuoDB on Amazon Web Services but I prefer to install it on my local machine for the purposes of this blog). Step 2: Once you have downloaded the NuoDB installer, double click on it to install it on the Windows platform. Here is the enlarged the icon of the installer. Step 3: Follow the wizard installation, as it is pretty straight forward and easy to do so. I have selected all the options to install as the overall installation is very simple and it does not take up much space. I have installed it on my C drive but you can select your preferred drive. It is quite possible that if you do not have 64 bit Java, it will throw following error. If you face following error, I suggest you to download 64-bit Java from here. Make sure that you download 64-bit Java from following link: http://java.com/en/download/manual.jsp If already have Java 64-bit installed, you can continue with the installation as described in following image. Otherwise, install Java and start from with Step 1. As in my case, I already have 64-bit Java installed – and you won’t believe me when I say that the entire installation of NuoDB only took me around 90 seconds. Click on Finish to end to exit the installation. Step 4: Once the installation is successful, NuoDB will automatically open the following two tabs – Console and DevCenter — in your preferred browser. On the Console tab you can explore various components of the NuoDB solution, e.g. QuickStart, Admin, Explorer, Storefront and Samples. We will see various components and their usage in future blog posts. If you follow these steps in this post, which I have followed to install NuoDB, you will agree that the installation of NuoDB is extremely smooth and it was indeed a pleasure to install a database product with such ease. If you have installed other database products in the past, you will absolutely agree with me. 
So download NuoDB and install it today, and in tomorrow’s blog post I will take the installation to the next level. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB

    Read the article

  • Smooth Sailing or Rough Waters: Navigating Policy Administration Modernization

    - by helen.pitts(at)oracle.com
    Life insurance and annuity carriers continue to recognize the need to modernize their aging policy administration systems, but may be hesitant to move forward because of the inherent risk involved. To help carriers better prepare for what lies ahead, LOMA's Resource Magazine asked Karen Furtado, partner of Strategy Meets Action, to help them chart a course in Navigating Policy Administration Selection, the cover story of this month’s issue. The industry analyst and research firm recently asked insurance carriers to name the business drivers for replacing legacy policy administration systems. The top five cited, according to Furtado, centered on: supporting growth in current lines, improving competitive position, containing and reducing costs, supporting growth in new lines, and supporting agent demands and interaction. It’s no surprise that fueling growth, both now and in the future, continues to be a key driver for modernization. Why? Inflexible, hard-coded legacy systems require customization by IT every time a change is required. This in turn impedes a carrier’s ability to be agile, constraining their ability to quickly adapt to changing regulatory requirements and evolving market demands. It also stymies their ability to quickly bring new products to market or rapidly configure changes to existing ones, and it can inhibit how carriers service customers and distribution channels. In the article, Furtado advised carriers to ensure that the policy administration system they are considering is current and modern, with an adaptable user interface and flexible service-oriented architecture. She said carriers should ask themselves, “How much do you need flexibility and agility now and in the future? Does it support the business processes and rules that are needed for you to be able to create that adaptable environment?” Furtado went on to advise that carriers “Connect your strategy to your business and technical capabilities before you make investment choices…You want to enable your organization to transform for the future, not just automate the past.” Unlocking High Performance with Policy Administration Transformation was also the topic of a recent LOMA webcast moderated by Ron Clark, editor of LOMA's Resource Magazine.
The webcast, which featured speakers from Oracle Insurance and Capgemini, focused on how insurers can competitively drive high performance by: replacing a legacy policy administration system with a modern, flexible platform; optimizing IT and operations costs, creating consistent processes and eliminating resource redundancies; selecting the right partner with the best blend of technology, operational, and consulting capabilities to achieve market leadership; and understanding the value of outsourcing closed block operations. Learn more by clicking here to access this free, one-hour recorded webcast. Helen Pitts is senior product marketing manager for Oracle Insurance's life and annuities solutions.

    Read the article

  • InfoPath 2010 Form Design and Web Part Deployment

    - by JKenderdine
    In January I had the pleasure of speaking at SharePoint Saturday Virginia Beach. I presented a session on InfoPath 2010 form design, which included some of the basics of form design, a description of some of the new options in InfoPath 2010 and SharePoint 2010, and other integration possibilities. Included below is the information presented, as well as the solution used to create the demo.
    The first thing you need to understand is the difference between an InfoPath list form and a form library form.
    SharePoint List Forms: store data directly in a SharePoint list. Each control (e.g. text box) in the form is bound to a column in the list. SharePoint list forms are directly connected to the list, which means that you don’t have to worry about setting up the publish and submit locations. You also do not have the option for back-end code.
    Form Library Forms: store data in XML files in a SharePoint form library. This means they are more flexible and you can do more with them. For example, they can be configured to save drafts and submit to different locations. However, they are more complex to work with and require more decisions to be made during configuration. You do have the option of back-end code with these types of forms.
    Next steps: You need to create your file architecture plan. Plan the location for the saved template – both test and production (this is pretty much a given, but just in case: always make sure to have a test environment). Plan for the location of the published template.
    Then you need to document your form template design plan. Some questions to ask to gather your requirements: What will the form be designed to do? Will it gather user information? Will it display data from a data source? Do we need to show different views to different users, and what do we base this on? How will it be implemented for the users – as a browser-based or client-based form? Will it be published as a site collection content type through Central Administration, or published directly to a form library?
    So what were the requirements for this template? The Business Card Request form template design plan: gather user information and requirements for the card; pull in as much user information as possible, using data from the user profile web services as a data source; show and hide fields as necessary for the requirements; create multiple views – one for those submitting the form and another view for the executive assistants placing the orders; deliver a browser-based form integrated into a SharePoint team site; and publish directly to the form library.
    The form was published through Central Administration and incorporated into the site as a content type. Utilizing the new InfoPath Web Part, the form is integrated into the page and users can complete the form directly from within that page. For now, if you are interested in the final form XSN, contact me using the Contact link above. I will post soon with the details on how the form was created and how it integrated the requirements detailed above.

    Read the article

  • links for 2010-12-08

    - by Bob Rhubart
    What's a data architect? A comic dialog by one who knows: Oracle ACE Director Lewis Cunningham.
    Webcast: Oracle Business Intelligence Forum - December 15, 2010 at 9:00 am PT. "The Oracle Business Intelligence Online Forum is a half-day virtual event that offers you a unique opportunity to see, in one place, the full portfolio of Oracle’s Business Intelligence (BI) offerings, and to learn what sets Oracle apart from the rest. Hear Oracle executives and industry analyst Howard Dresner present the current state of Business Intelligence, along with a series of customers who will share their case studies of putting analytics in action."
    Oracle Rolls Out Private Cloud Architecture And World-Record Transaction Performance | Forrester Blogs. "Exadata has been dealt with extensively in other venues, both inside Forrester and externally, and appears to deliver the goods for I&O groups who require efficient consolidation and maximum performance from an Oracle database environment." -- Richard Fichera, Forrester
    Seven ways to get things started: Java EE Startup Classes with GlassFish and WebLogic. "This is a blog about a topic that I really don't like. But it comes across my way over and over again, and there's no doubt that you need it from time to time. Enough reasons for me to collect some information about it and publish it for your reference. I am talking about Startup-/Shutdown classes with Java EE applications or servers." -- Oracle ACE Director Markus "@myfear" Eisele
    Monitoring Undelivered Messages in BPEL in SOA 10g (Antony Reynolds' Blog). "I am currently working with a client that wants to know how many undelivered messages they have, and if it reaches a certain threshold then they want to alert the operator. To do this they plan on using the Enterprise Manager alert functions, but first they need to know how many undelivered instances are out there." -- SOA author Antony Reynolds
    VirtualBox Appliances for Developers. "Developers can simply download a few files, assemble them with a script, and then import and run the resulting pre-built VM in VirtualBox. This makes starting with these technologies even easier. Each appliance contains some Hands-On-Labs to start learning." -- Peter Paul van de Beek
    Oracle UCM 11g Remote Intradoc Client (RIDC) Integration with Oracle ADF 11g. "It's great we have out of the box WebCenter ADF task flows for document management in UCM. However, for complete business scenario implementations, usually it's not enough and we need to manage the Content Repository programmatically. This can be achieved through the Remote Intradoc Client (RIDC) API. It's quite hard to find any practical information about this API, but I managed to get code for UCM folder creation/removal and folder information." -- Oracle ACE Director Andrejus Baranovskis
    Interview with Java Champion Matjaz B. Juric on Cloud Computing, SOA, and Java EE 6. "Matjaz Juric of Slovenia, head of the Cloud Computing and SOA Competence Centre at the University of Maribor, and professor at the University of Ljubljana, shares insights about cloud computing, SOA and Java EE 6."
    White Paper: Oracle Complex Event Processing High Availability. "This whitepaper describes the high availability (HA) solutions available in Oracle CEP 11g Release 1 Patch Set 2 and presents the results of a benchmark study demonstrating the performance of the Oracle CEP HA solutions."

    Read the article

  • Low level programming - what's in it for me?

    - by back2dos
    For years I have considered digging into what I consider "low level" languages. For me this means C and assembly. However, I have had no time for this yet, nor has it EVER been necessary. Now, because I don't see any necessity arising, I feel like I should either just schedule some point in time when I will study the subject or drop the plan forever.
    My Position: For the past 4 years I have focused on "web technologies", which may change, and I am an application developer, which is unlikely to change. In application development, I think usability is the most important thing. You write applications to be "consumed" by users. The more usable those applications are, the more value you have produced. In order to achieve good usability, I believe the following things are vital. Good design: well-thought-out features accessible through a well-thought-out user interface. Correctness: the best design isn't worth anything if it is not implemented correctly. Flexibility: an application A should constantly evolve, so that its users need not switch to a different application B that has new features A could have implemented. Applications addressing the same problem should not differ in features but in philosophy. Performance: performance contributes to a good user experience. An application is ideally always responsive and performs its tasks reasonably fast (based on their frequency). The value of performance optimization beyond the point where it is noticeable by the user is questionable. I think low level programming is not going to help me with any of that, except for performance. But writing a whole app in a low level language for the sake of performance is premature optimization to me.
    My Question: What could low level programming teach me that other languages wouldn't? Am I missing something, or is it just a skill that is of very little use for application development? Please understand that I am not questioning the value of C and assembly. It's just that in my everyday life, I am quite happy that all the intricacies of that world are abstracted away and managed for me (mostly by layers written in C/C++ and assembly themselves). I just don't see any concepts that could be new to me, only details I would have to stuff my head with. So what's in it for me?
    My Conclusion: Thanks to everyone for their answers. I must say, nobody really surprised me, but at least now I am quite sure I will drop this area of interest until any need for it arises. To my understanding, writing assembly these days for the processors used in today's CPUs is not only unnecessarily complicated, but risks resulting in poorer runtime performance than a C counterpart. Optimizing by hand is nearly impossible due to out-of-order execution, while you do not get all the kinds of optimizations a compiler can do automatically. Also, the code is either portable, because it uses a small subset of the available instructions, or it is optimized, but then it probably works on one architecture only. Writing C is not nearly as necessary anymore as it was in the past. If I were to write an application in C, I would just as much use tested and established libraries and frameworks that would spare me from implementing string copy routines, sorting algorithms and other kinds of stuff that serve as exercises at university. My own code would execute faster at the cost of type safety.
    I am neither keen on reinventing the wheel in the course of normal app development, nor on trying to debug by looking at core dumps :D I am currently experimenting with languages and interpreters, so if there is anything I would like to publish, I suppose I'd port a working concept to C, although C++ might just as well do the trick. Again, thanks to everyone for your answers and your insight.

    Read the article

  • A few questions about integrating AudioKinetic Wwise and Unity

    - by SaldaVonSchwartz
    I'm new to Wwise and to using it with Unity, and though I have gotten the integration to work, I'm still dealing with some loose ends and have a few questions. (I'm on Unity 4.3 as of now, but I think it shouldn't make any difference.) The base path: The Wwise documentation implies you set this in the AkGlobalSoundEngineInitializer basePath public ivar, which is exposed to the editor. However, I found that this variable is not really used. Instead, the path is hardcoded to /Audio/GeneratedSoundBanks in AkBankPath. I had to modify both scripts to actually look in the path that I set in the editor property. What's the deal with this? Just sloppiness, or am I missing something? Also about paths: since I'm on Mac, I'm using Unity natively under OS X and, in tandem, the Wwise authoring tool via VMware, and I share the OS X Unity project folder so I can generate the soundbanks into the assets folder. However, the authoring tool (I downloaded the latest one for Windows) doesn't automatically generate any "platform-specific" subfolders for my Wwise files. That is, again, the Unity integration scripts assume the path to be /Audio/GeneratedSoundBanks/<my-platform>/, which in my case would be Mac (I set the authoring tool to generate for Mac). The documentation says Wwise will automatically generate the platform-specific folders, but it just dumps all the stuff in GeneratedSoundBanks. Am I missing some setting? Because right now I just manually create the /Mac folder. The C# methods AkSoundEngine.PostEvent and AkSoundEngine.LoadBank, for instance, have a few overloads, including ones where I can refer to the soundbanks or events by their ID. However, if I try to use these, for instance AkSoundEngine.LoadBank(bankID, AkSoundEngine.AK_DEFAULT_POOL_ID), where bankID is the int I got from the .h header, I get AK_Fail. If I use the overloads that reference the objects by string name, then it works. What gives? Converting the ID header to C#: The integration comes with a C# script that seems to fork a process to call Python, which in turn converts the C++ header into a C# script. This always fails unless I manually execute the Python script myself from outside Unity. Might be a permissions thing, but has anyone experienced this? The Profiler: I set up the Unity player to run in the background and am using the "Profile" version of the plugin. However, when I start the Unity OS X standalone app, the profiler in VMware does not see it. This might just be because I'm trying to see a running instance of the sound engine inside an OS X binary from a Windows virtual machine, but I'm wondering if anyone has gotten the Windows profiler to see an OS X Unity binary. Different versions of the integration plugin: It's not clear to me from the documentation whether I have to manually remove the "Profile" version and install the "Release" version (or write a script to do it) when I'm going to do a release build, or if I should install both versions in Unity and it will select the right one. Thanks!

    Read the article

  • SQL SERVER – Iridium I/O – SQL Server Deduplication that Shrinks Databases and Improves Performance

    - by Pinal Dave
    Database performance is a common problem for SQL Server DBAs. It seems like we spend more time on performance than on just about anything else. In many cases, we use scripts or tools that point out performance bottlenecks, but we don’t have any way to fix them. For example, what do you do when you need to speed up a query that is already tuned as well as possible? Or what do you do when you aren’t allowed to make changes to a database supporting a purchased application? Iridium I/O for SQL Server was originally built at Confio Software (makers of Ignite) because DBAs kept asking for a way to actually fix performance instead of just pointing out performance problems. The technology is certified by Microsoft and was so promising that it was spun out into a separate company that is now run by the Confio founder/CEO and technology management team. Iridium uses deduplication technology to both shrink databases and boost I/O performance. It is intriguing to see it work. It will deduplicate a live database as it is running transactions; you can watch the database get smaller while user queries are running. Iridium is a simple tool to use. After installing the software, you click an “Analyze” button, which will spend a minute or two on each database and estimate both your storage and performance savings. Next, you click an “Activate” button to turn on Iridium I/O for your selected databases. You don’t need to reboot the operating system or restart the database during any part of the process. As part of my test, I also wanted to see if there would be an impact on my databases when Iridium was removed. The ‘revert’ process (bringing the files back to their SQL Server native format) was executed by a simple click of a button, and completed while the databases were available for normal processing. I was impressed, enjoyed playing with the software, and encourage all of you to try it out. Here is the link to the website to download Iridium for free.
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
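    If you want to quantify the shrink yourself, one simple approach is to capture the file sizes before clicking “Activate” and again afterwards, then compare. The query below is only an illustrative sketch built on standard SQL Server catalog views – it is not part of the Iridium tooling, and the database name is a placeholder you would replace with your own:

        -- Capture data and log file sizes for one database (size is reported in 8 KB pages)
        SELECT DB_NAME(database_id) AS database_name,
               name                 AS logical_file_name,
               type_desc            AS file_type,
               size * 8 / 1024      AS size_mb
        FROM sys.master_files
        WHERE database_id = DB_ID('YourDatabase');   -- hypothetical database name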

    Read the article

  • top Tweets SOA Partner Community – June 2012

    - by JuergenKress
    Send your tweets @soacommunity #soacommunity and follow us at http://twitter.com/soacommunity Simone Geib Contact me directly for ideas how to improve http://bit.ly/advancedsoasuite and additional posts, presentations, white papers, #soasuite SOA CommunitySOA Community Newsletter May 2012 https://soacommunity.wordpress.com /2012/05/28/soa-community-newsletter-may-2012/ #soacommunity Simone Geib #soasuite advanced OTN page has become too cluttered. Broke it into separate pages to start with. http://bit.ly/advancedsoasuite SOA CommunitySOA Management with Enterprise Manager Cloud Control 12c and Business Transaction Management 12c Demo https://soacommunity.wordpress.com /2012/05/21/soa-management-with-enterprise-manager-cloud-control-12c-and-business-transaction-management-12c-demo/ #soacommunity OracleBlogs June Webcast: SOA Gateway Implementation and Troubleshooting (2 sessions) http://ow.ly/1kbRFA OTNArchBeatEvery cloud needs an SOA lining: analyst | @JoeMcKendrick http://zd.net/KTgMHk ServiceTechSymposium New session just posted to calendar: "NoSQL for Data Services, Data Virtualization & Big Data" by Guido Schmutz, Trivadis AG ://ow.ly/bjjOe OTNArchBeat?Every cloud needs an SOA lining: analyst | @JoeMcKendrick http://zd.net/KTgMHk Debra Lilley looks good - real proof people are using the apps ! RT @fteter:Very cool Fusion Applications Help site: http://bit.ly/L3nvOR #FusionApps OTNArchBeat How to Set JVM Parameters in Oracle SOA 11G | Francis Ip http://bit.ly/JBDYPj demed"rapid proliferation of cloud computing will drive convergence of SOA and cloud paradigms" http://ovum.com/2012/05/18/soa-paves-the-way-for-cloud/ SOA Community Sending out invitations to our advanced Fusion Middleware Summer Camps! Want to learn more register for the community http://www.oracle.com/goto/emea/soa SOA Community Middleware Oracle Excellence Awards 2012 - HAPPY NEW YEAR! https://soacommunity.wordpress.com/ 2012/05/31/middleware-oracle-excellence-awards-2012 happy-new-year/ #soacommunity #opn #opnaward #specialization #oracle Simone Geib #oraclesoa performance tuning resources. All in one: docs, blogs, WPs, ppts: http://bit.ly/soa_resources OracleBlogs Middleware Oracle Excellence Awards 2012 - HAPPY NEW YEAR! http://ow.ly/1k9ri0 ServiceTechSymposiumNew session just posted to Symposium calendar: "Service Modeling & BPM Business Value Patterns" by Jürgen Kress, Oracle http://www.servicetechsymposium.com/ agenda2012.php #service_modeling_and_bpm _business_value_patterns SOA Community Happy New Year #soacommunity thanks for the business! Time for a drink ;-) http://pic.twitter.com/zkK08KWB Jan van ZoggelUsing execute-sql() function for Name-Value pair lookups in Oracle Service Bus http://wp.me/p1H430-jZ SOA Community Middleware Oracle Excellence Awards 2012&ndash;HAPPY NEW YEAR! http://wp.me/p10C8u-q4 orclateamsoa A-Team Blog #ateam: BPM 11g Deployment & Instance Migration - I have seen a number of request lately asking how to http://ow.ly/1jZ0h8 OTNArchBeat Who should ‘own’ the Enterprise Architecture? | Michael Glas http://bit.ly/K0ge0Q Oracle UPK & Tutor TOMORROW! (June 23rd) - UPK Professional Webinar at Noon ET: Discover why user adoption is a key factor for the http://bit.ly/LjZjdx Sabine Leitner Finance Event im Design-Hotel beim Barbeque: 21. 
Juni FRA mit Kunden SV Informatik, Schufa, LBBW http://bit.ly/JtwE3v #Oracle @itevent OracleEnterpriseMgr SOA Management with Enterprise Manager Cloud Control 12c and Business Transaction Management 12c Demo http://ow.ly/b3WP1 #em12c ServiceTechSymposium New session just posted to Symposium calendar: "Elastic SOA in the Cloud" by Steve Millidge, C2B2 Consulting http://www.servicetechsymposium.com /agenda2012.php #elastic_soa_in_the_cloud OTNArchBeat Securing Heterogeneous Systems Using Oracle Web Services Manager by @rluttikhuizen & Jens Peters http://bit.ly/KjShFi Oracleteamsoa A-Team Blog #ateam: How to Set JVM Parameters in Oracle SOA 11G http://ow.ly/1k2cnl SOA Community Oracle Service Registry in an automated (Maven) SOA/BPM build http://redstack.wordpress.com /2012/05/22/using-oracle-service-registry-in-an-automated-maven-soabpm-build/ #soacommunity #redstack #soa #osr #opn SOA CommunityHigh demand for advanced Fusion Middleware Summer Camps! Want to learn more register for the #soacommunity http://www.oracle.com/goto/emea/soa OracleBlogs? How to Set JVM Parameters in Oracle SOA 11G http://ow.ly/1k1UTv SOA Community top Tweets SOA Partner Community &ndash; May 2012 http://wp.me/p10C8u-pP ServiceTechSymposium New session just posted to Symposium calendar: "SOA Governance at EDP: A Global Energy Company" by Manuel Rosa, Link http://www.servicetechsymposium.com/ agenda2012.php #soa_governance_at_edp For regular information on Oracle SOA Suite become a member in the SOA Partner Community for registration please visit  www.oracle.com/goto/emea/soa (OPN account required) Blog Twitter LinkedIn Mix Forum Technorati Tags: soacommunity,twitter,Oracle,SOA Community,Jürgen Kress,OPN,SOA,BPM

    Read the article

  • SQL SERVER – Creating All New Database with Full Recovery Model

    - by pinaldave
    Sometimes, complex problems have very simple solutions. Let us see the following email which I received recently. “Hi Pinal, In our system when we create a new database, by default, it is created with the Simple Recovery Model. We have to manually change the recovery model after we create the database. We use the following simple T-SQL code: CREATE DATABASE dbname. We are very frustrated with this situation. We want all our databases to have the Full Recovery Model option by default. We are considering the following methods; please suggest the most efficient one among them. 1) Creating a Policy; when it is violated, the recovery model can be fixed 2) Triggers at Server Level 3) An automated job which goes through all the databases and checks their recovery model; if the DBA has not changed the model, then the job will list the databases and change their recovery model. Also, we have a situation where we need a database in the Simple Recovery Model as well – how do we white list them? Please suggest the best method.”
    Indeed, an interesting email! The answer to their question – which is the best method to fit their needs (white list, default, etc.)? – is NONE of the above. Here is the solution in one line, and also the easiest way: just go to your Model database. Path in SSMS >> Databases >> System Databases >> model >> Right Click >> Properties >> Options >> Recovery Model >> select Full from the dropdown. Every newly created database takes its base template from the Model Database. If you create a custom SP in the Model Database, it will automatically exist in every database you create afterwards. Any database that was already created before making changes in the Model Database will not be affected at all. Creating a policy is also a good method, and I will blog about this in a separate blog post, but looking at the current requirements of the reader, I think the Model Database should be modified to have the Full Recovery option. While writing this blog post, I remembered another blog post of mine where the model database log file was growing drastically even though there were no transactions: SQL SERVER – Log File Growing for Model Database – model Database Log File Grew Too Big.
    NOTE: Please do not touch the Model Database unnecessarily. It is a strict “No.” If you want to create an object that you need in all the databases, then instead of creating it in the Model Database, I suggest that you create a new database called maintenance and create the object there.
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, Readers Question, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
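    For readers who prefer to script the change instead of clicking through SSMS, the same idea can be expressed in plain T-SQL. Treat the snippet below as a sketch of the approach described above – the Simple Recovery database name is a placeholder for whatever database you want to keep on the white list:

        -- Make FULL the default recovery model for all future databases
        -- by setting it on the model database (the template for new databases)
        ALTER DATABASE model SET RECOVERY FULL;

        -- For a database that genuinely needs the Simple Recovery Model
        -- (the "white list" case), switch it individually after creation
        ALTER DATABASE YourSimpleDb SET RECOVERY SIMPLE;   -- hypothetical database name

        -- Verify the recovery model of every database on the instance
        SELECT name, recovery_model_desc FROM sys.databases;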

    Read the article
