Search Results

Search found 3505 results on 141 pages for 'chris koenig'.


  • SQL SERVER – Weekend Project – Visiting Friend’s Company – Koenig Solutions

    - by pinaldave
    I have decided to do an interesting experiment every weekend and share it the next week as a weekend project on the blog. Many times our business lives and personal lives are very separate; however, this post will talk about one instance where my two lives connect. This weekend I visited my friend’s company. My friend owns Koenig, so of course I am very interested to see that they are doing well. I have been very impressed this year, as they have expanded into multiple cities and are offering more and more classes about Business Intelligence, Project Management, networking, and much more. Koenig Solutions was originally a company that focused on training IT professionals – covering topics from databases to English language courses. As the company grew more popular, Koenig began a blog to keep fans updated, and has gradually added more and more courses. I am very happy for my friend’s success, but as a technology enthusiast I am also pleased with Koenig Solutions’ success. Whenever anyone in our field improves, the field as a whole does better. Koenig offers high quality classes on a variety of topics at a variety of levels, so anyone can benefit from browsing their blog. I am a big fan of technology (obviously), and I feel blessed to have gotten in on the “ground floor,” even though there are some people out there who think technology has advanced as far as possible – I believe they will be proven wrong. And that is why I think companies like Koenig Solutions are so important – they are providing training and support in a quickly growing field, and providing job skills in this tough economy. I believe this particular post really highlights how I, and Koenig, feel about the IT industry. It is quickly expanding, and job opportunities are sure to abound – but how can the average person get started in this exciting field? This post emphasizes that knowledge is power – know what interests you in the IT field, get an education, and continue your training even after you’ve gotten your foot in the door. Koenig Solutions provides what I feel is one of the most important services in the modern world – in-person training. They obviously offer many online courses, but you can also set up physical, face-to-face training through their website. As I mentioned before, they offer a wide variety of classes that cater to nearly every IT skill you can think of. If you have more specific needs, they also offer one of the best English language training courses. English is turning into the language of technology, so these courses can ensure that you are keeping pace. Koenig Solutions and I agree about how important training can be, and even better – they provide some of the best training around. We share ideals, and I am very happy to see the success of my friend. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, SQLAuthority Author Visit, T SQL, Technology Tagged: Developer Training

    Read the article

  • Including an embedded framework using a cross-project-reference: Header no such file or directory

    - by d11wtq
    I'm trying to create a Cocoa framework by using a cross-project reference in Xcode. I have 2 projects: one for the framework; one for the application that will use the framework. This framework is not intended to be stored within the system; it is an embedded framework that lives within the application bundle. I have successfully made the cross-project reference, marked the framework as being a dependency of my target, added a Copy Files build phase that puts the framework in Contents/Frameworks/ and added the framework to the linker phase (I checked the little "Target" checkbox; I've also done it manually by dragging the framework into the linker phase). My framework's install directory is correctly set to @executable_path/../Frameworks. However, when I try to build my app it: a) Correctly builds the framework first b) Correctly copies the framework c) Errors because it cannot find the master header file in my framework I have verified that the header is there. I can see it in the app product that is partially built. ls build/Debug/CioccolataTest.webapp/Contents/Frameworks/Cioccolata.framework/Headers/Cioccolata.h build/Debug/CioccolataTest.webapp/Contents/Frameworks/Cioccolata.framework/Headers/Cioccolata.h I have been able to successfully build the app by copying my framework into /Library/Frameworks (I can then delete it again after the successful build), but this is a workaround, I'm looking to find it out why Xcode doesn't find the framework's master header file without it being copied to a system directory. Is copying it to the app bundle during the build not sufficient? Here's the full build transcript if it's any help (it's just a Hello World app right now, so not much going on here): Build Cioccolata of project Cioccolata with configuration Debug SymLink /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/Current A cd /Users/chris/Projects/Mac/Cioccolata /bin/ln -sf A /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/Current SymLink /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Resources Versions/Current/Resources cd /Users/chris/Projects/Mac/Cioccolata /bin/ln -sf Versions/Current/Resources /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Resources SymLink /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Headers Versions/Current/Headers cd /Users/chris/Projects/Mac/Cioccolata /bin/ln -sf Versions/Current/Headers /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Headers SymLink /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Cioccolata Versions/Current/Cioccolata cd /Users/chris/Projects/Mac/Cioccolata /bin/ln -sf Versions/Current/Cioccolata /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Cioccolata ProcessInfoPlistFile /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Resources/Info.plist Info.plist cd /Users/chris/Projects/Mac/Cioccolata builtin-infoPlistUtility Info.plist -expandbuildsettings -platform macosx -o /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Resources/Info.plist CpHeader build/Debug/Cioccolata.framework/Versions/A/Headers/CWHelloWorld.h CWHelloWorld.h cd /Users/chris/Projects/Mac/Cioccolata /Developer/Library/PrivateFrameworks/DevToolsCore.framework/Resources/pbxcp -exclude .DS_Store -exclude CVS -exclude .svn -resolve-src-symlinks /Users/chris/Projects/Mac/Cioccolata/CWHelloWorld.h 
/Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Headers CpHeader build/Debug/Cioccolata.framework/Versions/A/Headers/Cioccolata.h Cioccolata.h cd /Users/chris/Projects/Mac/Cioccolata /Developer/Library/PrivateFrameworks/DevToolsCore.framework/Resources/pbxcp -exclude .DS_Store -exclude CVS -exclude .svn -resolve-src-symlinks /Users/chris/Projects/Mac/Cioccolata/Cioccolata.h /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Headers CopyStringsFile /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Resources/English.lproj/InfoPlist.strings English.lproj/InfoPlist.strings cd /Users/chris/Projects/Mac/Cioccolata setenv ICONV /usr/bin/iconv /Developer/Library/Xcode/Plug-ins/CoreBuildTasks.xcplugin/Contents/Resources/copystrings --validate --inputencoding utf-8 --outputencoding UTF-16 English.lproj/InfoPlist.strings --outdir /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Resources/English.lproj ProcessPCH /var/folders/Xy/Xy-bvnxtFpiYBQPED0dK1++++TI/-Caches-/com.apple.Xcode.501/SharedPrecompiledHeaders/Cioccolata_Prefix-dololiigmwjzkgenggebqtpvbauu/Cioccolata_Prefix.pch.gch Cioccolata_Prefix.pch normal i386 objective-c com.apple.compilers.gcc.4_2 cd /Users/chris/Projects/Mac/Cioccolata setenv LANG en_US.US-ASCII /Developer/usr/bin/gcc-4.2 -x objective-c-header -arch i386 -fmessage-length=0 -pipe -std=gnu99 -Wno-trigraphs -fpascal-strings -fasm-blocks -O0 -Wreturn-type -Wunused-variable -isysroot /Developer/SDKs/MacOSX10.5.sdk -mfix-and-continue -mmacosx-version-min=10.5 -gdwarf-2 -iquote /Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-generated-files.hmap -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-own-target-headers.hmap -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-all-target-headers.hmap -iquote /Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-project-headers.hmap -F/Users/chris/Projects/Mac/Cioccolata/build/Debug -I/Users/chris/Projects/Mac/Cioccolata/build/Debug/include -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/DerivedSources/i386 -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/DerivedSources -c /Users/chris/Projects/Mac/Cioccolata/Cioccolata_Prefix.pch -o /var/folders/Xy/Xy-bvnxtFpiYBQPED0dK1++++TI/-Caches-/com.apple.Xcode.501/SharedPrecompiledHeaders/Cioccolata_Prefix-dololiigmwjzkgenggebqtpvbauu/Cioccolata_Prefix.pch.gch CompileC build/Cioccolata.build/Debug/Cioccolata.build/Objects-normal/i386/CWHelloWorld.o /Users/chris/Projects/Mac/Cioccolata/CWHelloWorld.m normal i386 objective-c com.apple.compilers.gcc.4_2 cd /Users/chris/Projects/Mac/Cioccolata setenv LANG en_US.US-ASCII /Developer/usr/bin/gcc-4.2 -x objective-c -arch i386 -fmessage-length=0 -pipe -std=gnu99 -Wno-trigraphs -fpascal-strings -fasm-blocks -O0 -Wreturn-type -Wunused-variable -isysroot /Developer/SDKs/MacOSX10.5.sdk -mfix-and-continue -mmacosx-version-min=10.5 -gdwarf-2 -iquote /Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-generated-files.hmap -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-own-target-headers.hmap -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-all-target-headers.hmap 
-iquote /Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Cioccolata-project-headers.hmap -F/Users/chris/Projects/Mac/Cioccolata/build/Debug -I/Users/chris/Projects/Mac/Cioccolata/build/Debug/include -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/DerivedSources/i386 -I/Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/DerivedSources -include /var/folders/Xy/Xy-bvnxtFpiYBQPED0dK1++++TI/-Caches-/com.apple.Xcode.501/SharedPrecompiledHeaders/Cioccolata_Prefix-dololiigmwjzkgenggebqtpvbauu/Cioccolata_Prefix.pch -c /Users/chris/Projects/Mac/Cioccolata/CWHelloWorld.m -o /Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Objects-normal/i386/CWHelloWorld.o Ld /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Cioccolata normal i386 cd /Users/chris/Projects/Mac/Cioccolata setenv MACOSX_DEPLOYMENT_TARGET 10.5 /Developer/usr/bin/gcc-4.2 -arch i386 -dynamiclib -isysroot /Developer/SDKs/MacOSX10.5.sdk -L/Users/chris/Projects/Mac/Cioccolata/build/Debug -F/Users/chris/Projects/Mac/Cioccolata/build/Debug -filelist /Users/chris/Projects/Mac/Cioccolata/build/Cioccolata.build/Debug/Cioccolata.build/Objects-normal/i386/Cioccolata.LinkFileList -install_name @executable_path/../Frameworks/Cioccolata.framework/Versions/A/Cioccolata -mmacosx-version-min=10.5 -framework Foundation -single_module -compatibility_version 1 -current_version 1 -o /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework/Versions/A/Cioccolata Touch /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework cd /Users/chris/Projects/Mac/Cioccolata /usr/bin/touch -c /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework Build CioccolataTest of project CioccolataTest with configuration Debug ProcessInfoPlistFile /Users/chris/Projects/Mac/CioccolataTest/build/Debug/CioccolataTest.webapp/Contents/Info.plist Info.plist cd /Users/chris/Projects/Mac/CioccolataTest builtin-infoPlistUtility Info.plist -expandbuildsettings -platform macosx -o /Users/chris/Projects/Mac/CioccolataTest/build/Debug/CioccolataTest.webapp/Contents/Info.plist PBXCp build/Debug/CioccolataTest.webapp/Contents/Frameworks/Cioccolata.framework /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework cd /Users/chris/Projects/Mac/CioccolataTest /Developer/Library/PrivateFrameworks/DevToolsCore.framework/Resources/pbxcp -exclude .DS_Store -exclude CVS -exclude .svn -resolve-src-symlinks /Users/chris/Projects/Mac/Cioccolata/build/Debug/Cioccolata.framework /Users/chris/Projects/Mac/CioccolataTest/build/Debug/CioccolataTest.webapp/Contents/Frameworks CopyStringsFile /Users/chris/Projects/Mac/CioccolataTest/build/Debug/CioccolataTest.webapp/Contents/Resources/English.lproj/InfoPlist.strings English.lproj/InfoPlist.strings cd /Users/chris/Projects/Mac/CioccolataTest setenv ICONV /usr/bin/iconv /Developer/Library/Xcode/Plug-ins/CoreBuildTasks.xcplugin/Contents/Resources/copystrings --validate --inputencoding utf-8 --outputencoding UTF-16 English.lproj/InfoPlist.strings --outdir /Users/chris/Projects/Mac/CioccolataTest/build/Debug/CioccolataTest.webapp/Contents/Resources/English.lproj CompileC build/CioccolataTest.build/Debug/CioccolataTest.build/Objects-normal/i386/main.o main.m normal i386 objective-c com.apple.compilers.gcc.4_2 cd /Users/chris/Projects/Mac/CioccolataTest setenv LANG en_US.US-ASCII /Developer/usr/bin/gcc-4.2 -x objective-c -arch i386 -fmessage-length=0 
-pipe -std=gnu99 -Wno-trigraphs -fpascal-strings -fasm-blocks -O0 -Wreturn-type -Wunused-variable -isysroot /Developer/SDKs/MacOSX10.5.sdk -mfix-and-continue -mmacosx-version-min=10.5 -gdwarf-2 -iquote /Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/CioccolataTest-generated-files.hmap -I/Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/CioccolataTest-own-target-headers.hmap -I/Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/CioccolataTest-all-target-headers.hmap -iquote /Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/CioccolataTest-project-headers.hmap -F/Users/chris/Projects/Mac/CioccolataTest/build/Debug -I/Users/chris/Projects/Mac/CioccolataTest/build/Debug/include -I/Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/DerivedSources/i386 -I/Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/DerivedSources -include /Users/chris/Projects/Mac/CioccolataTest/prefix.pch -c /Users/chris/Projects/Mac/CioccolataTest/main.m -o /Users/chris/Projects/Mac/CioccolataTest/build/CioccolataTest.build/Debug/CioccolataTest.build/Objects-normal/i386/main.o In file included from <command-line>:0: /Users/chris/Projects/Mac/CioccolataTest/prefix.pch:13:35: error: Cioccolata/Cioccolata.h: No such file or directory /Users/chris/Projects/Mac/CioccolataTest/main.m: In function 'main': /Users/chris/Projects/Mac/CioccolataTest/main.m:13: error: 'CWHelloWorld' undeclared (first use in this function) /Users/chris/Projects/Mac/CioccolataTest/main.m:13: error: (Each undeclared identifier is reported only once /Users/chris/Projects/Mac/CioccolataTest/main.m:13: error: for each function it appears in.) /Users/chris/Projects/Mac/CioccolataTest/main.m:13: error: 'hello' undeclared (first use in this function)

    Read the article

  • Teeing Off With Chris Leone at OpenWorld 2012

    - by Kathryn Perry
    A guest post by Chris Leone, Senior Vice President, Oracle Applications Development. Monday morning in downtown San Francisco - lots of sunshine, plenty of traffic, and sidewalks chock-full of people with fresh faces and blister-free feet. Let the week of Oracle OpenWorld begin! For a great Applications start, Chris Leone packed the house with his Fusion Applications overview session - he covered strategy, scope, roadmaps, and customer successes. Fusion Apps, the world's best SaaS suite, is built on 100 percent standards. Chris talked about its information-driven user experience, its innovative design, and the choice of deployment. People can run Fusion in the cloud, in a managed/hosted environment, or on premise -- or they can use a combination of these three models. About seventy percent of our customers go with SaaS. Release 5 of Fusion Apps will become available soon. The cadence of releases will be three times a year. The key drivers are to accelerate business success (no rip and replace) and to simplify business processes. Chris told the audience that organic Fusion is the centerpiece of our cloud solutions, rounded out with acquired offerings such as Taleo Recruiting and RightNow Customer Service. From the cloud solutions, customers can expect real-time and predictive BI, social capabilities, choice of deployment, and more productivity because of a next-generation UX called FUSE. Chris's demo showed a super easy new UI that touts self-service navigation. We'll blog about FUSE in the very near future. Chris said the next 365 days of Fusion Apps would include more localization, more industries, more power, more mobile, and more configurability. The audience was challenged to think hard about how Fusion could be part of their three-to-five-year plans. Chris set up a great opportunity for you to follow up with your customers as they explore the possibilities.

    Read the article

  • SQLAuthority News – Technical Review of Learning at Koenig Solutions

    - by pinaldave
    Yesterday I finished my three-day, fast-track, in-person learning of the course End to End SQL Server Business Intelligence at Koenig Solutions. You can read my previous article over here regarding why I am learning SQL Server. Yesterday I blogged about my experience of arriving at the Training Center and my induction at the center. The Training Days: I had enrolled for three days of training, so my routine on each of the three days was very much the same. However, the content was different every day, as I was learning something new each day. Let me describe a few of the interesting details of my daily routine. A Single Student Batch: The best part of my training was that I was the only student in my batch. Koenig is known for smaller batches, and they often have single-student batches as well. I was delighted to know that I would have dedicated access to, and attention from, my trainer, as I would be the only student in the batch. In most of the labs I observed there were no more than 4 students at any time. (Photo: Prakash and Pinal) 7:30 AM Breakfast Talk: All of us students gathered at 7:30 in the breakfast area. The best time of the day. I was the only Indian student in the group. The other students were from the USA, Canada, Nigeria, Bhutan, Tanzania, and a few other countries. I immediately became the source of information and the reference manual. Though the distance between Delhi and Bangalore is 2000+ km, I was considered a local guy. 8:30 AM Heading to the Training Center: Every day without fail, at 8:30 the van left our accommodation for the training center. As mentioned in an earlier blog post, the ride is about 5 minutes, and we were able to reach the location before 8:45. This gave us some time to settle in before our class started at 9:00 AM. 9:00 AM Order Lunch: It may sound funny, since we had just had breakfast 30 minutes earlier, but the first thing everybody has to do as soon as class starts is order lunch. There is an online portal to order food for the day, and everybody has to place their order early so the food arrives on time at lunch. The options are plenty, and everybody can order whatever they like. 9:05 AM Learning Starts: After deciding on lunch, we started the learning. I was very fortunate to have a very experienced trainer - Prakash Chheatry. Though I had never met him before, I had heard a lot about Prakash. He is known as the topmost SQL Server trainer in India. His student list contains some of the best-known SQL Server experts in the world and a few best-selling SQL Server book authors. Learning continued till 1:00 PM with one tea/coffee break in between. 1:00 PM Lunch: Lunch time is again the fun time. All of us students get together in the afternoon and tell the stories of the world. Indeed, the best part of the day besides learning new stuff. 4:55 PM Ready to Return: We stop at 4:55, as at precisely 5:00 PM the van stops by the institute to take us back to our accommodation. Trust me, it is always a seriously long day, but the amount of learning is the win of the day. 7:30 PM Dinner Time: After coming back to the accommodation I study till 7:30 and then rush for dinner. Dinner is world cuisine, and the desserts are really delicious. Every day after dinner I have written a blog post and retired early, as the next day is always going to be busier than the present one. What did I learn? As I mentioned earlier, I know SQL Server fairly well. I had expressed the same in my conversations as well. This is the reason I was assigned a fairly senior trainer, and we learned everything quite quickly. As I already knew quite a few things, we went pretty fast through many topics. There were a few things I wanted to learn in detail as well as practice in the labs. We slowed down where we wanted to and rushed through the concepts where I was very comfortable. Here is the list of the things which we covered in the action-packed three days: Introduction to Business Intelligence (Intro), SQL Server Analysis Services (Theory and Lab), SQL Server Integration Services (Theory and Lab), SQL Server Reporting Services (Theory and Lab), SQL Server PowerPivot (Lab), UDM (Theory), SharePoint Concepts (Theory), Power View (Demo), and Business Intelligence and Security (Discussion). Well, I was delighted that I was able to refresh lots of concepts during these three days. Thanks to my trainer and my friend who helped me to have a good learning experience. I believe all the learning will help me in my growth and future career. With this I end this account of my experience. I am planning to have another online learning experience later this month. I will blog about my experience as I begin it. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, T SQL, Technology

    Read the article

  • West Palm Beach .Net User Group with Chris Eargle - February 22nd, 2011

    - by Sam Abraham
    Chris Eargle, Telerik Evangelist, Microsoft MVP and INETA Speaker, was our guest speaker at the West Palm Beach .Net User Group February 2011 meeting. Chris shared many advanced C# tricks that he has learned throughout his many years of programming, in a talk that earned rave reviews from all attendees. At the end of our event, we held a free raffle of two Telerik Ultimate Collection licenses and various .Net Ninja shirts. We would like to thank Chris for sharing with us, and we look forward to having him back at our group at his earliest convenience. Below are some pictures of the event:

    Read the article

  • Chris Brook-Carter at the Oracle Retail Week Awards VIP Reception

    - by user801960
    The Oracle VIP Reception at the Oracle Retail Week Awards last week saw retail luminaries from around the UK and Europe gather to have a drink and celebrate the successes of retail in the last year. Guests included Lord Harris of Peckham, Tesco's Philip Clarke, Vanessa Gold from Ann Summers, former Retail Week editor Tim Danaher, Richard Pennycook from Morrisons and Ian Cheshire from Kingfisher Group. The new Retail Week editor-in-chief, Chris Brook-Carter, attended and took the time to speak to the guests about the value of the Oracle Retail Week Awards to the industry and to thank Oracle for its dedication to supporting the industry. Chris said: "I'd like to say a real heartfelt thanks to our partner this evening: Oracle. I had the privilege of being at the judging day and I got to meet Sarah and the team and I was struck by not only the passion that they have for the whole awards system and everything that means in terms of rewarding excellence within the retail industry but also their commitment to retail in general, and it's that sort of relationship that marks out retail as such a fantastic sector to be involved in." Chris's speech can be watched in full below:

    Read the article

  • Tuesday 6th Manchester SQL User Group - Chris Testa-O'Neil (Loading a datawarehouse using SSIS) and

    - by tonyrogerson
    Chris will give a talk on loading a data warehouse using SQL Server Integration Services, and Tony Rogerson will give a talk on Database Design: Normalisation/Denormalisation and using Surrogate Keys - practicalities/pitfalls and benefits in Microsoft SQL Server. Registration is essential; you can register here: http://sqlserverfaq.com?eid=218 . Come and join us for an evening of SQL Server discussion; as well as the two formal sessions by Chris Testa-O'Neil and Tony Rogerson there will be a chance...(read more)

    Read the article

  • Tellago 2011: Dwight, Chris and Don are MVPs

    - by gsusx
    It’s been a great start to 2011. Tellago’s Dwight Goins has been named a Microsoft BizTalk Server MVP for 2011. I’ve always said that Dwight should have been an MVP a long time ago. His contributions to the BizTalk Server community are nothing short of remarkable. In addition to Dwight, my colleagues Don Demsak and Chris Love also renewed their respective MVP awards. A few more of us are up for renewal later in the year. In recognition of Dwight’s award, we have made him the designated doorman...(read more)

    Read the article

  • Umbraco Certified Developer - Level 2 - Chris Houston

    - by Vizioz Limited
    I just thought I'd create a quick blog post to say that I have now been on the Umbraco Level 2 course (which I would recommend!), and although it turned out that I pretty much knew 95% of what was taught, the extra 5% and a chance to take a trip to Copenhagen made it worth it :) I am now officially Umbraco Level 2 certified :) Hopefully over the next month I will have some time to start adding a few more useful posts to my blog. I know I've been a little slack on the posting in the last month; it's just been a busy time for me!

    Read the article

  • Oracle Australia Supports MS Sydney to Gong Ride by Chris Sainsbury

    - by user769227
    What is the Sydney to Gong Ride? The Gong Ride is a one-of-a-kind fundraising event. You can pedal 90 km from Sydney to Wollongong on any day of the year, but it's only on the first Sunday of November that you'll experience the camaraderie, fellowship, unity, safety, scenery and sense of achievement of pedalling in support of people living with MS. Well done to the 22 members of the Oracle team who took part in the Sydney to Gong Ride on Sunday 6 November. For many, this was the first time riding over distance – officially a 90km event, by GPS about 84km. The event started in Sydney Park, Newtown. We left in a few separate groups between 6.30 and 7.30am – and finished with times between 2 hours 45 mins and 6 hours. With 10,000 riders there was a lot of congestion at the start, but that soon thinned out as we left Sydney. It was a great spring day for the event, but at 34 degrees it was getting pretty warm once we left the shade of the Royal National Park and carried on over the Sea Cliff bridge and down the coast road towards Wollongong. Unfortunately Dan managed to get himself a facial scrub when someone clipped his front wheel on the descent from Bald Hill lookout. No major incidents thankfully, and Dan soldiered on. Most importantly everyone had a good time (even Dan) and we raised $5,800 for Multiple Sclerosis Australia. In total more than $3.7m was raised for this good cause.

    Read the article

  • Oracle Desktop Virtualization at HIMSS 2011

    - by chris.kawalek(at)oracle.com
    The HIMSS Conference is an extremely important industry trade show put on by The Healthcare Information and Management Systems Society. It's being held in Florida starting this Sunday, February 20th. Their slogan, "Linking people, potential, and progress", could be true of Oracle desktop virtualization as well! The Oracle desktop virtualization group has worked very closely with the Oracle healthcare business unit to have a large presence at this show, and I wanted to tell you a bit about what we're doing: - All Oracle demos are being done on Sun Ray Clients. That's right, every demo pod in the large Oracle booth will have a Sun Ray Client with each demo tied to a smart card. Too many people at your demo station? Pop your card out and go to a different one. We'll also be demoing Oracle desktop virtualization at a dedicated demo station, too. This is great stuff! Find Oracle at booth #1651. Oracle's page about HIMSS - Focus Group - Caregiver Mobility with Oracle Sun Ray Clients and Desktop Virtualization, Feb 22, 3:15-4:15 PM. This focus group will be for customers interested in Oracle desktop virtualization. It's invitation only, but you can comment on this blog post and we can give you info on how to attend (your comment won't be made public). - Solution Session - Fast, Secure, Workflow Optimized: Inexpensive Access to Care Information is Possible Inside and Outside of the Hospital, Feb 23, 4:15 PM, Booth #685, Wireless and Mobility Theatre. Oracle's Adam Workman will cover caregiver mobility and the benefits of Oracle desktop virtualization to healthcare organizations. - New healthcare solutions page on oracle.com: We've created a page dedicated to content involving desktop virtualization and healthcare. This will be your one-stop shop if you're looking for desktop virtualization and healthcare information. - New desktop virtualization and healthcare solution data sheet: This document outlines how we define "Caregiver Mobility" and how Oracle products are used to facilitate quicker, more secure access to patient data. We'll have some more updates from the show next week. It looks like it's going to be an exciting event! -Chris

    Read the article

  • Two Virtualization Webinars This Week

    - by chris.kawalek(at)oracle.com
    If you're interested in virtualization, be sure to catch our two free webinars this week. You'll hear directly from Oracle technologists and can ask questions in a live Q&A. Deploying Oracle VM Templates for Oracle E-Business Suite and Oracle PeopleSoft Enterprise Applications, Tuesday, Feb 15, 2011, 9AM Pacific Time. Register Now. Is your company trying to manage costs, meet or beat service level agreements, and get employees up and running quickly on business-critical applications like Oracle E-Business Suite and Oracle PeopleSoft Enterprise Applications? The fastest way to get the benefits of these applications deployed in your organization is with Oracle VM Templates. Cut application deployment time from weeks to just hours or days. Attend this session for the technical details of how your IT department can deliver rapid software deployment and eliminate installation and configuration costs by providing pre-installed and pre-configured software images. Increasing Desktop Security for the Public Sector with Oracle Desktop Virtualization, Thursday, Feb 17, 2011, 9AM Pacific Time. Register Now. Security of data as it moves across desktop devices is a concern for all industries. But organizations such as law enforcement, local, state, and federal government and others have higher security needs than most. A virtual desktop model, where no data is ever stored on the local device, is an ideal architecture for these organizations to deploy. Oracle's comprehensive portfolio of desktop virtualization solutions, from thin client devices to server-side management and desktop hosting software, provides a complete solution for this ever-increasing problem.

    Read the article

  • The perfect DotNetNuke Christmas present

    - by Chris Hammond
    Are you racking your brain trying to come up with a gift for that DotNetNuke person in your life? If so, I’ve got just the solution! You can buy them my book! DotNetNuke 5 User’s Guide: Get your website up and running! It’s the perfect item for the DotNetNuke love of your life. If you buy a copy and want it signed, I’ll even offer to sign it if you mail it to me. Please be sure to include postage both ways. You probably won’t be able to get it to me and back in time for Christmas, but the signing can happen...(read more)

    Read the article

  • HIMSS 2011 and New Press Release

    - by chris.kawalek(at)oracle.com
    We're here at HIMSS 2011 in booth 1651. If you're at the show, tomorrow (Wednesday) is the final day for the exhibits, so come over and see all of the Oracle demos displayed on Sun Ray Clients. It's extremely cool! Also, we did a press release here at the show about caregiver mobility with Wolf Medical Software. Have a read here. Wolf Medical Software did a press release themselves, too. You can read their press release here.

    Read the article

  • SQL SERVER – Introduction to SQL Server 2014 In-Memory OLTP

    - by Pinal Dave
    In SQL Server 2014, Microsoft has introduced a new database engine component called In-Memory OLTP, aka project “Hekaton”, which is fully integrated into the SQL Server Database Engine. It is optimized for OLTP workloads accessing memory-resident data. In-Memory OLTP helps us create memory-optimized tables, which in turn offer significant performance improvements for typical OLTP workloads. The main objective of memory-optimized tables is to ensure that highly transactional tables can live in memory and remain in memory without losing even a single record. The most significant part is that they still support the majority of Transact-SQL statements. Transact-SQL stored procedures can be compiled to machine code for further performance improvements on memory-optimized tables. This engine is designed to ensure higher concurrency and minimal blocking. In-Memory OLTP alleviates the issue of locking by using a new type of multi-version optimistic concurrency control. It also substantially reduces waiting for log writes by generating far less log data and needing fewer log writes. Points to remember: Memory-optimized tables refer to tables using the new data structures and keywords added as part of In-Memory OLTP. Disk-based tables refer to the normal tables we have been creating in SQL Server since its inception; these tables use fixed-size 8 KB pages that need to be read from and written to disk as a unit. Natively compiled stored procedures refer to a new object type supported by the In-Memory OLTP engine, which compiles them into machine code; this can further improve data access performance for memory-optimized tables. Natively compiled stored procedures can only reference memory-optimized tables; they can't be used to reference any disk-based table. Interpreted Transact-SQL stored procedures are what SQL Server has always used. Cross-container transactions refer to transactions that reference both memory-optimized tables and disk-based tables. Interop refers to interpreted Transact-SQL that references memory-optimized tables. Using In-Memory OLTP: The In-Memory OLTP engine has been available as part of SQL Server 2014 since the June 2013 CTPs. Installation of In-Memory OLTP is part of the SQL Server setup application. The In-Memory OLTP components can only be installed with a 64-bit edition of SQL Server 2014; hence they are not available with 32-bit editions. Creating Databases: Any database that will store memory-optimized tables must have a MEMORY_OPTIMIZED_DATA filegroup. This filegroup is specifically designed to store the checkpoint files needed by SQL Server to recover the memory-optimized tables, and although the syntax for creating the filegroup is almost the same as for creating a regular filestream filegroup, it must also specify the option CONTAINS MEMORY_OPTIMIZED_DATA.
    Here is an example of a CREATE DATABASE statement for a database that can support memory-optimized tables:

        CREATE DATABASE InMemoryDB
        ON PRIMARY (NAME = [InMemoryDB_data], FILENAME = 'D:\data\InMemoryDB_data.mdf', SIZE = 500MB),
        FILEGROUP [SampleDB_mod_fg] CONTAINS MEMORY_OPTIMIZED_DATA
            (NAME = [InMemoryDB_mod_dir1], FILENAME = 'S:\data\InMemoryDB_mod_dir'),
            (NAME = [InMemoryDB_mod_dir2], FILENAME = 'R:\data\InMemoryDB_mod_dir')
        LOG ON (NAME = [SampleDB_log], FILENAME = 'L:\log\InMemoryDB_log.ldf', SIZE = 500MB)
        COLLATE Latin1_General_100_BIN2;

    The example code above creates files on three different drives (D:, S: and R:) for the data files and the in-memory storage, so if you would like to run this code, kindly change the drive and folder locations as per your convenience. Note that the two containers in the memory-optimized filegroup must have distinct logical names. Also notice that a Windows (non-SQL) binary collation was specified; a BIN2 collation is the only collation supported at this point for any indexes on memory-optimized tables. It is also possible to add a MEMORY_OPTIMIZED_DATA filegroup to an existing database; use the command below to achieve the same.

        ALTER DATABASE AdventureWorks2012
            ADD FILEGROUP hekaton_mod CONTAINS MEMORY_OPTIMIZED_DATA;
        GO
        ALTER DATABASE AdventureWorks2012
            ADD FILE (NAME = 'hekaton_mod', FILENAME = 'S:\data\hekaton_mod') TO FILEGROUP hekaton_mod;
        GO

    Creating Tables: There is no major syntactical difference between creating a disk-based table and a memory-optimized table, but there are a few restrictions and a few new essential extensions. Essentially, any memory-optimized table should use the MEMORY_OPTIMIZED = ON clause, as shown in the CREATE TABLE query example. DURABILITY clause (SCHEMA_AND_DATA or SCHEMA_ONLY): A memory-optimized table should always be defined with a DURABILITY value, which can be either SCHEMA_AND_DATA or SCHEMA_ONLY, the former being the default. A memory-optimized table defined with DURABILITY = SCHEMA_ONLY will not persist its data to disk, which means data durability is compromised, whereas DURABILITY = SCHEMA_AND_DATA ensures that the data is persisted along with the schema. Indexing Memory-Optimized Tables: A memory-optimized table created with DURABILITY = SCHEMA_AND_DATA must always have an index, and this can be achieved by declaring a PRIMARY KEY constraint at the time of creating the table. The following example shows a PRIMARY KEY index created as a HASH index, for which a bucket count must also be specified.

        CREATE TABLE Mem_Table
        (
            [Name] VARCHAR(32) NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 100000),
            [City] VARCHAR(32) NULL,
            [State_Province] VARCHAR(32) NULL,
            [LastModified] DATETIME NOT NULL
        ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

    As you can see in the query example above, we have used the clause MEMORY_OPTIMIZED = ON to make sure that it is treated as a memory-optimized table and not just a normal table, and we have also used the DURABILITY = SCHEMA_AND_DATA clause, which means it will persist data along with the metadata. You can also notice that this table has a PRIMARY KEY declared upfront, which is mandatory for memory-optimized tables. We will talk more about HASH indexes and BUCKET_COUNT in later articles on this topic, which will focus more on row and index storage for memory-optimized tables. So stay tuned for that as well. Now that we have covered the basics of memory-optimized tables and understood the key things to remember while using them, let's explore some examples to understand the performance gains from memory-optimized tables.
    I will be using the database I created earlier in this article, i.e. InMemoryDB, in the demo exercise below.

        USE InMemoryDB
        GO
        -- Creating a disk based table
        CREATE TABLE dbo.Disktable
        (
            Id INT IDENTITY,
            Name CHAR(40)
        )
        GO
        CREATE NONCLUSTERED INDEX IX_ID ON dbo.Disktable (Id)
        GO
        -- Creating a memory optimized table with similar structure and DURABILITY = SCHEMA_AND_DATA
        CREATE TABLE dbo.Memorytable_durable
        (
            Id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
            Name CHAR(40)
        ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)
        GO
        -- Creating another memory optimized table with similar structure but DURABILITY = SCHEMA_ONLY
        CREATE TABLE dbo.Memorytable_nondurable
        (
            Id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
            Name CHAR(40)
        ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY)
        GO
        -- Now insert 100000 records in dbo.Disktable and observe the time taken
        DECLARE @i_t BIGINT
        SET @i_t = 1
        WHILE @i_t <= 100000
        BEGIN
            INSERT INTO dbo.Disktable(Name) VALUES('sachin' + CONVERT(VARCHAR, @i_t))
            SET @i_t += 1
        END
        GO
        -- Do the same inserts for the memory table dbo.Memorytable_durable and observe the time taken
        DECLARE @i_t BIGINT
        SET @i_t = 1
        WHILE @i_t <= 100000
        BEGIN
            INSERT INTO dbo.Memorytable_durable VALUES(@i_t, 'sachin' + CONVERT(VARCHAR, @i_t))
            SET @i_t += 1
        END
        GO
        -- Now finally do the same inserts for the memory table dbo.Memorytable_nondurable and observe the time taken
        DECLARE @i_t BIGINT
        SET @i_t = 1
        WHILE @i_t <= 100000
        BEGIN
            INSERT INTO dbo.Memorytable_nondurable VALUES(@i_t, 'sachin' + CONVERT(VARCHAR, @i_t))
            SET @i_t += 1
        END
        GO

    The above 3 inserts took 1.20 minutes, 54 secs, and 2 secs respectively to insert 100000 records on my machine with 8 GB RAM. This proves the point that memory-optimized tables can definitely help businesses achieve better performance for their highly transactional tables, and memory-optimized tables with DURABILITY = SCHEMA_ONLY are faster still, as they do not bother persisting their data to disk, which makes them supremely fast. Koenig Solutions is one of the few organizations which offer IT training on SQL Server 2014 and all its updates. Now, I leave the decision on using memory-optimized tables to you. I hope you like this article and that it helped you understand the fundamentals of In-Memory OLTP. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Koenig

    Read the article

  • C++ operator lookup rules / Koenig lookup

    - by John Bartholomew
    While writing a test suite, I needed to provide an implementation of operator<<(std::ostream&... for Boost unit test to use. This worked:

        namespace theseus { namespace core {
            std::ostream& operator<<(std::ostream& ss, const PixelRGB& p) {
                return (ss << "PixelRGB(" << (int)p.r << "," << (int)p.g << "," << (int)p.b << ")");
            }
        }}

    This didn't:

        std::ostream& operator<<(std::ostream& ss, const theseus::core::PixelRGB& p) {
            return (ss << "PixelRGB(" << (int)p.r << "," << (int)p.g << "," << (int)p.b << ")");
        }

    Apparently, the second wasn't included in the candidate matches when g++ tried to resolve the use of the operator. Why (what rule causes this)? The code calling operator<< is deep within the Boost unit test framework, but here's the test code:

        BOOST_AUTO_TEST_SUITE(core_image)
        BOOST_AUTO_TEST_CASE(test_output)
        {
            using namespace theseus::core;
            BOOST_TEST_MESSAGE(PixelRGB(5,5,5)); // only compiles with operator<< definition inside theseus::core
            std::cout << PixelRGB(5,5,5) << "\n"; // works with either definition
            BOOST_CHECK(true); // prevent no-assertion error
        }
        BOOST_AUTO_TEST_SUITE_END()

    For reference, I'm using g++ 4.4 (though for the moment I'm assuming this behaviour is standards-conformant).
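
    For context, here is a minimal, self-contained sketch of the rule in play (argument-dependent lookup, also known as Koenig lookup). It is independent of Boost, and the PixelRGB definition below is assumed and simplified for illustration rather than taken from the project:

        // Sketch only: simplified, assumed PixelRGB; no Boost required.
        #include <iostream>
        #include <sstream>

        namespace theseus { namespace core {

            struct PixelRGB {
                unsigned char r, g, b;
                PixelRGB(int r_, int g_, int b_) : r(r_), g(g_), b(b_) {}
            };

            // Declared in the same namespace as PixelRGB, so argument-dependent
            // lookup (ADL) adds it to the candidate set wherever "stream << pixel"
            // appears, even deep inside another library's namespaces.
            std::ostream& operator<<(std::ostream& ss, const PixelRGB& p) {
                return ss << "PixelRGB(" << (int)p.r << "," << (int)p.g << "," << (int)p.b << ")";
            }

        }} // namespace theseus::core

        // A global-namespace overload, by contrast, is only reachable through
        // ordinary unqualified lookup. When the call site sits inside another
        // namespace (such as the Boost.Test internals) that already declares
        // some operator<<, name hiding stops unqualified lookup before it
        // reaches the global namespace, and ADL does not help because the
        // global namespace is not associated with PixelRGB.

        int main() {
            std::ostringstream out;
            out << theseus::core::PixelRGB(5, 5, 5);  // resolved via ADL
            std::cout << out.str() << "\n";           // prints PixelRGB(5,5,5)
            return 0;
        }

    Under that reading, keeping the operator inside theseus::core (the first snippet above) is the reliable option, because the namespaces of an argument's type are always searched no matter where the call is made from.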

    Read the article

  • SQLAuthority News – Learning Trip – Traveling to Learn SQL Server

    - by pinaldave
    I am currently traveling to Delhi to learn SQL Server in person from my friend. You can read more details about why I am learning SQL Server. I have signed up for the course End to End SQL Server Business Intelligence at Koenig Solutions. Yesterday I blogged about my registration experience, and today I am going to write about my experience once I arrived in Delhi. From Ahmedabad to Delhi: I stay with my wife and daughter in Bangalore (the IT hub of India); my hometown is Ahmedabad. My parents stay in a city near Ahmedabad. I decided to spend a few days with my folks before I signed up for 3 days of solid learning. I had selected an early morning flight to Delhi and landed at 8:30 AM. As soon as I checked email on my mobile I was really glad that I had received details of my pick-up vehicle from Koenig. I walked out of the airport and noticed a driver waiting with a placard with my name and photo on it. He was in Koenig uniform, so there was no chance of a mistake. Within minutes of landing in Delhi I was in my transport, heading to the Koenig Training Center. After a quick introduction the driver handed me a bag (to be precise, an eco-friendly bag). The bag contained the following items: my registration form; all the necessary documents in print, which I had received earlier; a printed book for the next day's course; and INR 1000 (what?). I was glad to receive the bag, but I was very confused by the Rs 1000. I decided to figure this out once I reached the training center. Arriving at Koenig Inn Deluxe: Koenig registration fees include all the stay and meals. I had opted for Koenig Inn Deluxe for my stay, as it was recommended by my friend and was also the right economical choice for me. When I reached the accommodation, they were well aware of my arrival and I was immediately led to my spacious room. The room is well equipped with all the amenities (hot water, air conditioning, coffee table, snacks, and free internet) and the staff is very friendly. I immediately got ready, as I had to go to the Koenig Training Center to meet the Center Head for a quick introduction. (Photo: Koenig Inn Deluxe) Koenig Training Center: The training center is within five minutes of the accommodation. I was led to the Center Head right away and had a very meaningful conversation with Ms Hema regarding my learning goals. She gave me a quick tour of the training center. I was amazed at the number of lab rooms they have in the center. The labs are spacious and give users the much-needed hands-on experience. I was shown the lab where I was supposed to attend my class the very next day, and I was also given my trainer's profile. Mystery of the Rs 1000: Well, after all this I had still not forgotten about the Rs 1000 I was given when I arrived at the airport. When I asked about it, I was told that many students come from foreign countries and may not have Indian currency when they land at the airport. The money is for their immediate use until they arrive at the training center; later on they can get their own currency converted to local currency at the Koenig Travel Desk. My curiosity was satisfied, but I had not expected this answer. I am amazed at the attention to detail. Koenig Travel Desk: When I heard about the Koenig Travel Desk, I remembered that I have a few friends in Delhi and Gurgaon. I had completed all of the formalities, so I had the rest of the day on my hands. I asked the travel desk if they could arrange a day cab for me so I could visit my friends in Gurgaon. Within 10 minutes I was on my way to Gurgaon. Telerik India Office Visit: What did I do in Gurgaon? I met my friends Abhishek Kant, Dhananjay Kumar and Amit Chowdhary. I visited the Telerik India office and we had an excellent conversation on various aspects of technology and community. The Telerik India office is very spacious, and Abhishek Kant (Telerik India Country Manager) gave us a quick tour of the office. We had an excellent lunch and dinner. One thing is for sure – the day was well spent. (Photo: Pinal Dave, Dhananjay Kumar and Abhishek Kant) Later that evening I returned to my accommodation and decided to read up on a few of the topics which I was going to learn the next day. In tomorrow's blog post I will discuss my learning experience. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, T SQL, Technology

    Read the article

  • No height using Chris Coyier full width hack

    - by ftntravis
    I'm using Chris Coyier's full-width hack on a site I am building, but I'm stumped on how to get the div to take on the height of whatever it contains. Usually I would achieve this by adding overflow:auto to the container, but if I do that it breaks the hack. Is it possible to achieve a height and still use this hack? You can see my problem here: http://beta.revival.tv/ Here is my CSS:

        #content-wrap:before, #content-wrap:after {content: ""; position: absolute; top: 0; bottom: 0; width: 9999px;}
        #content-wrap:before {right: 100%;}
        #content-wrap:after {left: 100%;}
        #content-wrap, #content-wrap:before, #content-wrap:after {background: #666;}
        #content-wrap { position: relative; width: 1000px; margin: 0 auto; padding: 25px 0; }

    Read the article

  • Unable to ssh out anywhere - ssh_exchange_identification

    - by Chowlett
    I have a setup where I'm running Ubuntu 11.10 as a VirtualBox guest under a Windows 7 host, behind a restrictive corporate firewall. I have set up NAT from the host port 22 to Ubuntu's port 22; IT inform me that they have opened port 22 outbound for the host machine's IP address. I have run ssh-keygen -t rsa, and am trying to test the setup by connecting to github and another known ssh server. In both cases the connect is refused with ssh_exchange_identification: Connection closed by remote host. Full -vvv log is below. Is this possibly still due to the corporate firewall? If so, what else might I need to request from them? Any other ideas what might be wrong and how to fix it? ~$ ssh -Tvvv [email protected] OpenSSH_5.8p1 Debian-7ubuntu1, OpenSSL 1.0.0e 6 Sep 2011 debug1: Reading configuration data /etc/ssh/ssh_config debug1: Applying options for * debug2: ssh_connect: needpriv 0 debug1: Connecting to github.com [207.97.227.239] port 22. debug1: Connection established. debug3: Incorrect RSA1 identifier debug3: Could not load "/home/chris/.ssh/id_rsa" as a RSA1 public key debug2: key_type_from_name: unknown key type '-----BEGIN' debug3: key_read: missing keytype debug2: key_type_from_name: unknown key type 'Proc-Type:' debug3: key_read: missing keytype debug2: key_type_from_name: unknown key type 'DEK-Info:' debug3: key_read: missing keytype debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug2: key_type_from_name: unknown key type '-----END' debug3: key_read: missing keytype debug1: identity file /home/chris/.ssh/id_rsa type 1 debug1: Checking blacklist file /usr/share/ssh/blacklist.RSA-2048 debug1: Checking blacklist file /etc/ssh/blacklist.RSA-2048 debug1: identity file /home/chris/.ssh/id_rsa-cert type -1 debug1: identity file /home/chris/.ssh/id_dsa type -1 debug1: identity file /home/chris/.ssh/id_dsa-cert type -1 debug1: identity file /home/chris/.ssh/id_ecdsa type -1 debug1: identity file /home/chris/.ssh/id_ecdsa-cert type -1 ssh_exchange_identification: Connection closed by remote host Edit: Requested diagnostics: ~$ ls -la ~/.ssh total 16 drwx------ 2 chris chris 4096 2012-03-30 13:12 . drwxr-xr-x 29 chris chris 4096 2012-03-30 13:25 .. -rw------- 1 chris chris 1766 2012-03-30 13:12 id_rsa -rw-r--r-- 1 chris chris 409 2012-03-30 13:12 id_rsa.pub

    Read the article

  • SQL SERVER – Get 2 of My Books FREE at Koenig Tech Day – Where Technologies Converge!

    - by pinaldave
    As a regular reader of my blog, you must be aware that I love to write books and talk about the various subjects of my books. The founders of Koenig Solutions are very old friends of mine; I have known them for many years, and they have been among the biggest supporters of my books. This coming weekend they have a technology event at their Bangalore location. Every attendee of the technology event will get a set of two books worth Rs. 450 – ‘SQL Server Interview Questions And Answers‘ and ‘SQL Wait Stats Joes 2 Pros‘. I am going to cover a couple of topics from the books and present as well. I am very confident that every attendee will have a great time. I will be covering the following subject: SQL Server Tricks and Tips for Blazing Fast Performance. Slow running queries are the most common problem that developers face while working with SQL Server. While it is easy to blame SQL Server for unsatisfactory performance, the issue often lies with the way queries have been written and how SQL Server has been set up. The session will focus on ways of identifying problems that slow down SQL Server, and tricks to fix them. Developers will walk out with scripts and knowledge that can be applied to their servers immediately after the session. After the session is over, I will point out the exact location in the book where you can continue further learning. I am pretty excited; this is more like a book reading, but in an entirely different format. The one-day event will cover four technologies in four separate interactive sessions: Microsoft SQL Server, Security, VMware/Virtualization, and ASP.NET MVC. Date of the event: Dec 15, 2012, 9 AM to 6 PM. Location of the event: Koenig Solutions Ltd. # 47, 4th Block, 100 feet Road, 3rd Floor, Opp to Shanthi Sagar, Koramangala, Bangalore- 560034 Mobile: 09008096122 Office: 080-41127140. The organizers have informed me that there are very limited seats for this event and the technical session based on my book will start at 9 AM sharp. If you show up late there is a chance that you will not get a seat. Registration for the event is a MUST. Please visit this link for further information. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • Why won't ruby recognize Haml under ubuntu64 while using jekyll static blog generator?

    - by oldmanjoyce
    I have been trying, quite unsuccessfully, to run henrik's fork of the jekyll static blog generator on Ubuntu 64-bit. I just can't seem to figure this out and I've tried a bunch of different things. Originally I posted this over at stackoverflow, but this is probably the better spot for it. The base stats of my machine: Ubuntu 9.04, 64 bit, ruby 1.8.7 (2008-08-11 patchlevel 72) [x86_64-linux], rubygems 1.3.1. When I attempt to build the site, this is what happens: $ jekyll --pygments Configuration from ./_config.yml Using Sass for CSS generation You must have the haml gem installed first Using rdiscount for Markdown Building site: . - ./_site /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/core_ext.rb:27:in `method_missing': undefined method 'header' for #, page=# ..... cut ..... (NoMethodError) from (haml):9:in `render' from /home/chris/.gem/gems/haml-2.2.3/lib/haml/engine.rb:167:in 'render' from /home/chris/.gem/gems/haml-2.2.3/lib/haml/engine.rb:167:in 'instance_eval' from /home/chris/.gem/gems/haml-2.2.3/lib/haml/engine.rb:167:in 'render' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/convertible.rb:72:in 'render_haml_in_context' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/convertible.rb:105:in 'do_layout' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/post.rb:226:in 'render' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/site.rb:172:in 'read_posts' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/site.rb:171:in 'each' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/site.rb:171:in 'read_posts' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/site.rb:210:in 'transform_pages' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/../lib/jekyll/site.rb:126:in 'process' from /home/chris/.gem/gems/henrik-jekyll-0.5.2/bin/jekyll:135 from /home/chris/.gem/bin/jekyll:19:in `load' from /home/chris/.gem/bin/jekyll:19 I added spaces to the left of the ClosedStruct to enable better visibility - sorry that my inline html/formatting isn't perfect. I also cut out some middle text that is just data. $ gem list *** LOCAL GEMS *** actionmailer (2.3.4) actionpack (2.3.4) activerecord (2.3.4) activeresource (2.3.4) activesupport (2.3.4) classifier (1.3.1) directory_watcher (1.2.0) haml (2.2.3) haml-edge (2.3.27) henrik-jekyll (0.5.2) liquid (2.0.0) maruku (0.6.0) open4 (0.9.6) rack (1.0.0) rails (2.3.4) rake (0.8.7) rdiscount (1.3.5) RedCloth (4.2.2) stemmer (1.0.1) syntax (1.0.0) Some showing for path verification: $ echo $PATH /home/chris/.gem/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games $ which haml /home/chris/.gem/bin/haml $ which jekyll /home/chris/.gem/bin/jekyll

    Read the article

  • Tulsa - Launch 2010 Highlight Events

    - by dmccollough
    Tuesday May 04, 2010 Renaissance Tulsa Hotel and Convention Center Seville II and III 6808 S. 107th East Avenue Tulsa Oklahoma 74133   For the Developer 1:00 PM – 5:00 PM Event Overview MSDN Events Present:  Launch 2010 Highlights Join your local Microsoft Developer Evangelism team to find out first-hand about how the latest features in Microsoft® Visual Studio® 2010 can help boost your development creativity and performance.  Learn how to improve the process of refactoring your existing code base and drive tighter collaboration with testers. Explore innovative web technologies and frameworks that can help you build dynamic web applications and scale them to the cloud. And, learn about the wide variety of rich application platforms that Visual Studio 2010 supports, including Windows 7, the Web, Windows Azure, SQL Server, and Windows Phone 7 Series.   Click here to register.   For the IT Professional 8:00 AM – 12:00 PM Event Overview TechNet Events Present:  Launch 2010 Highlights Join your local Microsoft IT Pro Evangelism team to find out first-hand what Microsoft® Office® 2010 and SharePoint® 2010 mean for the productivity of you and your people—across PC, phone, and browser.  Learn how this latest wave of technologies provides revolutionary user experience and how it takes us into a future of greater productivity.  Come and explore the tools that will help you optimize desktop deployment.   Click here to register.

    Read the article

  • Programming logic to group a user's activities like Facebook, e.g. Chris is now friends with A, B and C

    - by Chris Dowdeswell
    So I am trying to develop an activity feed for my site. Basically, if I UNION a bunch of activities into a feed I end up with something like the following:

        Chris is now friends with Mark
        Chris is now friends with Dave

    What I want, though, is a neater way of grouping these similar posts so the feed doesn't cause information overload... e.g.:

        Chris is now friends with Mark, Dave and 4 Others

    Any ideas on how I can approach this logically? I am using Classic ASP on SQL Server. Here is the UNION statement I have so far...

        SELECT U.UserID As UserID, L.UN As UN, Left(U.UID,13) As ProfilePic, U.Fname + ' ' + U.Sname As FullName,
               'said ' + WP.Post AS Activity, WP.Ctime
        FROM Users AS U
        LEFT JOIN Logins L ON L.userID = U.UserID
        LEFT OUTER JOIN WallPosts AS WP ON WP.userID = U.userID
        WHERE WP.Ctime IS NOT NULL
        UNION
        SELECT U.UserID As UserID, L.UN As UN, Left(U.UID,13) As ProfilePic, U.Fname + ' ' + U.Sname As FullName,
               'commented ' + C.Comment AS Activity, C.Ctime
        FROM Users AS U
        LEFT JOIN Logins L ON L.userID = U.UserID
        LEFT OUTER JOIN Comments AS C ON C.UserID = U.userID
        WHERE C.Ctime IS NOT NULL
        UNION
        SELECT U.UserID As UserID, L.UN As UN, Left(U.UID,13) As ProfilePic, U.Fname + ' ' + U.Sname As FullName,
               'connected with <a href="/profile.asp?un='+(SELECT Logins.un FROM Logins WHERE Logins.userID = Cn.ToUserID)+'">' +
               (SELECT Users.Fname + ' ' + Users.Sname FROM Users WHERE userID = Cn.ToUserID) + '</a>' AS Activity, Cn.Ctime
        FROM Users AS U
        LEFT JOIN Logins L ON L.userID = U.UserID
        LEFT OUTER JOIN Connections AS Cn ON Cn.UserID = U.userID
        WHERE CN.Ctime IS NOT NULL

    Read the article
