Search Results

Search found 19441 results on 778 pages for 'static libraries'.


  • APress Deal of the Day 28/May/2014 - Pro jQuery 2.0

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2014/05/28/apress-deal-of-the-day-28may2014---pro-jquery-2.0.aspx Today’s $10 Deal of the Day from APress at http://www.apress.com/9781430263883 is Pro jQuery 2.0. “jQuery is one of the most popular and powerful JavaScript libraries available today. Learn how to get the most from the latest jQuery version, jQuery 2.0, by focusing on the features you need for your project”

    Read the article

  • How do I put YAML code in GitHub MarkDown?

    - by rutherford
    The following YAML snippet is what I'm trying to place in my Markdown-formatted readme: ``` libraries: - name: numpy version: "1.6.1" ``` I've tried the above with both no trailing spaces and two trailing spaces at the end of each line; either way everything ends up on a single line. Obviously I want it on 3 separate lines in my README.md. How? Edit: note that I also tried adding 'yaml' as the optional lang parameter, but no dice either.

    Read the article

  • iPhone app development pricing [closed]

    - by AlexMorley-Finch
    I currently manage a website design and development company that integrates itself with print work too. I've been co-manager for a couple of months now, but business is slow at the moment. I got an email today from a potential client asking if we do iPhone app development. Obviously we do not. But seeing as there aren't many other projects on the go at the moment, why not give it a go? Searching Google was my first option. I did a bit of background scrubbing and discovered that I need certain developer tools etc. This can be arranged easily enough. The reason I came to Stack Exchange is for answers to these questions. I am a competent programmer and have had experience in Java, PHP, JavaScript, C++ and C#. So how long would it take me to come to grips with Objective-C? Assume an 8-hour working day. Not only the language, but also the iPhone interface libraries (if any exist) and other native libraries? The potential client said the app will be a catalogue, so I'm assuming the data will either be pulled online or stored in some kind of local database. I know the requirements are vague, but does anyone have any idea of how much time this would take? To learn the language, design the architecture and code? 40 hours? 80 hours? My final question kind of depends on the first two, but obviously the potential client wants a quote, and I have got no idea how much an app costs. I did some research and found a huge range of differences. The majority of the cost would come from the time to develop the app! So to summarise: How long would it take me to be comfortable coding Objective-C, taking into consideration my past knowledge? (I'm assuming a day or two to fully understand, but you really don't know.) How long would it take me to develop an app? How much should I charge (approx) for the development of this app?

    Read the article

  • Apps Script Office Hours - October 25, 2012

    Apps Script Office Hours - October 25, 2012 - Arun announces an election sample app - soon! Look for the blog post on googleappsdeveloper.blogspot.com - LAX hackathon googleappsdeveloper.blogspot.com - Bill (Google Hangout) asks about ScriptDb. Ikai makes a long analogy about libraries and datastores and offers possible explanations for why certain issues occur, as well as some of the difficulties in working with distributed datastores. From: GoogleDevelopers Views: 48 6 ratings Time: 29:34 More in Science & Technology

    Read the article

  • How would I detect if two 2D arrays of any shape collided?

    - by user2104648
    Say there are two or more movable objects of any shape in a 2D plane. Each object has its own 2D boolean array to act as a bounding box, which can range from 10 to 100 pixels. The program reads each pixel from an image that represents the object and sets the corresponding array element to true (the pixel has an alpha of more than 1) or false (the pixel has an alpha of less than one). Each time one of these objects moves, what would be the most accurate way to test whether it hit another object in Java, using as few APIs/libraries as possible?
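
    One low-dependency way to do this is a two-stage test: first check whether the two bounding rectangles overlap at all, and only then compare the boolean masks pixel by pixel over just the overlapping region. Below is a minimal Java sketch of that idea; the class name, fields and coordinate convention are assumptions for illustration (each object stores its mask plus its top-left position in world coordinates), not code from the question.

```java
/** Hypothetical sprite: a boolean alpha mask plus its top-left world position. */
final class MaskedObject {
    final boolean[][] mask; // mask[y][x] == true where the pixel is solid
    int x, y;               // top-left corner in world coordinates

    MaskedObject(boolean[][] mask, int x, int y) {
        this.mask = mask;
        this.x = x;
        this.y = y;
    }

    int width()  { return mask[0].length; }
    int height() { return mask.length; }
}

final class PixelCollision {
    /** Returns true if any solid pixel of a overlaps a solid pixel of b. */
    static boolean collides(MaskedObject a, MaskedObject b) {
        // 1. Cheap rejection: do the bounding boxes overlap at all?
        int left   = Math.max(a.x, b.x);
        int top    = Math.max(a.y, b.y);
        int right  = Math.min(a.x + a.width(),  b.x + b.width());
        int bottom = Math.min(a.y + a.height(), b.y + b.height());
        if (left >= right || top >= bottom) {
            return false;
        }
        // 2. Exact test: compare the masks only inside the overlapping rectangle.
        for (int wy = top; wy < bottom; wy++) {
            for (int wx = left; wx < right; wx++) {
                if (a.mask[wy - a.y][wx - a.x] && b.mask[wy - b.y][wx - b.x]) {
                    return true;
                }
            }
        }
        return false;
    }
}
```

    The bounding-box check keeps the per-pixel loop from running for objects that are nowhere near each other, which matters once many objects move every frame.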

    Read the article

  • Printing labels from an android app or Java? [on hold]

    - by user3563124
    I've seen others discussing printing labels with a dedicated label printer using Java. Is there any way to print labels on a regular printer loaded with label paper, ideally using Android libraries/APIs? If not, is there any way to do this in a non-mobile context using Java? Searching for "Java label printing" only turns up the aforementioned dedicated label printing bits, which aren't quite what I'm looking for.
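
    For the non-mobile case, the standard java.awt.print API can drive an ordinary desktop printer; printing on label stock is then mostly a matter of describing the label's paper size and imageable area. Here is a minimal sketch under those assumptions - the 4 x 2 inch label size, the margins and the text drawn are placeholders, not values from the question.

```java
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.print.*;

public class LabelPrintSketch {
    public static void main(String[] args) throws PrinterException {
        PrinterJob job = PrinterJob.getPrinterJob();

        // Describe the label stock: sizes are in 1/72-inch points.
        // A hypothetical 4in x 2in label = 288 x 144 points.
        Paper paper = new Paper();
        paper.setSize(288, 144);
        paper.setImageableArea(9, 9, 270, 126); // small margin all round

        PageFormat format = job.defaultPage();
        format.setPaper(paper);

        job.setPrintable(new Printable() {
            @Override
            public int print(Graphics g, PageFormat pf, int pageIndex) {
                if (pageIndex > 0) {
                    return NO_SUCH_PAGE; // single label in this sketch
                }
                Graphics2D g2 = (Graphics2D) g;
                g2.translate(pf.getImageableX(), pf.getImageableY());
                g2.drawString("Sample label text", 10, 20); // placeholder content
                return PAGE_EXISTS;
            }
        }, format);

        if (job.printDialog()) { // let the user pick the printer loaded with label paper
            job.print();
        }
    }
}
```

    On Android itself, the platform printing framework (PrintManager / PrintDocumentAdapter, available since API 19) is the usual route, but that is a separate code path from the desktop sketch above.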

    Read the article

  • How to convert an html page to pdf using javascript? [closed]

    - by user1439891
    I am developing a project in which I have a receipt page (this is the HTML page that I want to convert into PDF) that I have to print. When printing that page, the alignment does not come out properly. If I convert it into a PDF, the PDF will take care of the alignment, so my work will become easier and more effective. I am restricted to using only JavaScript or JS libraries to complete this task. Could any of you please help me?

    Read the article

  • The Connection String Reference Site

    - by Yousef_Jadallah
    On this great site, http://www.connectionstrings.com/, you can find about 517 connection strings and 120 providers, drivers and class libraries listed in the database, covering database servers as well as data files. You just need to choose the element you need and you will get all the information you are looking for.   Hope this helps,

    Read the article

  • Disallowed images in the robots.txt of my Joomla site can't be displayed when shared in Facebook

    - by opk
    I have noticed that since I disallowed images using the robots.txt on my Joomla site, when sharing an article on Facebook the image will not be displayed. Why is that? Is it indeed related? My robots.txt file:
    User-agent: *
    Disallow: /administrator/
    Disallow: /cache/
    Disallow: /cli/
    Disallow: /components/
    Disallow: /images/
    Disallow: /includes/
    Disallow: /installation/
    Disallow: /language/
    Disallow: /libraries/
    Disallow: /logs/
    Disallow: /media/
    Disallow: /modules/
    Disallow: /plugins/
    Disallow: /templates/
    Disallow: /tmp/

    Read the article

  • What is this thing called?!?! (bottom of screen popup)

    - by NRGdallas
    I am looking for the name, and some possible jQuery libraries etc., for the standard bottom-of-screen popup bar. It's a little bar that pops out at the bottom of the screen after X seconds (or whenever, really) - it generally slides up, is about 50px or so high, and is usually the length of the main container. It's used for some form of coupon advertisement or various promotional text, etc. What is the proper term for this item, and are there any good references to best-use guidelines?

    Read the article

  • Introducing Dart

    Introducing Dart The Dart project includes a modern scalable language, libraries, and tools to help developers build large complex web applications. Watch this video to learn about the different parts of the Dart project and how it can help you be more productive building high performance web apps. Learn more at dartlang.org From: GoogleDevelopers Views: 2205 69 ratings Time: 04:00 More in Science & Technology

    Read the article

  • Avoiding Heap Contention Among Threads

    Allocating memory from the system heap can be an expensive operation due to a lock used by system runtime libraries to synchronize access to the heap. Contention on this lock can limit the performance benefits from multithreading. Learn how to solve this problem.
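
    The summary above only names the problem, so here is a small Java sketch of one commonly used mitigation - giving each worker thread its own reusable scratch buffer via ThreadLocal, so hot paths stop going back to a shared allocation path on every call. This is an analogous technique shown for illustration, not necessarily the approach the linked article describes.

```java
import java.util.concurrent.*;

public class ThreadLocalBufferDemo {
    // Each worker thread gets (and reuses) its own 64 KB scratch buffer,
    // instead of allocating a fresh buffer on every call and contending
    // on the shared allocation path.
    private static final ThreadLocal<byte[]> SCRATCH =
            ThreadLocal.withInitial(() -> new byte[64 * 1024]);

    static long processChunk(int seed) {
        byte[] buf = SCRATCH.get(); // no allocation after the first call per thread
        for (int i = 0; i < buf.length; i++) {
            buf[i] = (byte) (seed + i);
        }
        long sum = 0;
        for (byte b : buf) {
            sum += b;
        }
        return sum;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        Future<?>[] futures = new Future<?>[4];
        for (int t = 0; t < 4; t++) {
            final int seed = t;
            futures[t] = pool.submit(() -> {
                long acc = 0;
                for (int i = 0; i < 10_000; i++) {
                    acc += processChunk(seed + i);
                }
                return acc;
            });
        }
        for (Future<?> f : futures) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}
```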

    Read the article

  • Infragistics CenterSpace Partnership

    Infragistics and CenterSpace Software have created a partnership to provide our customers with sample projects that demonstrate the power of integrating Infragistics' visualization with CenterSpace Software's math and statistical libraries.

    Read the article

  • Apps crashing with EXC_BAD_ACCESS when changing to a custom keyboard layout

    - by Adam Lindberg
    I have a custom layout installed (svdvorak_mac6.keylayout). After a reboot another keyboard layout was selected, so I selected my usual one instead. This lead to a lot of apps suddenly starting to crash (Chrome, Skype, Adium etc). I can change to any other built in layout for OS X, but as soon as I choose one custom installed one (either form ~/Library/Keyboard Layouts/ or from /Library/Keyboard Layouts/) the apps crash. The only thing I can remember that I did before the reboot was to install Google's video chat plugin for the browser. Here's the crash report from Adium: Process: Adium [372] Path: /Applications/Adium.app/Contents/MacOS/Adium Identifier: com.adiumX.adiumX Version: 1.4.1 (1.4.1) Code Type: X86 (Native) Parent Process: launchd [182] Date/Time: 2010-12-22 22:39:47.833 +0100 OS Version: Mac OS X 10.6.5 (10H574) Report Version: 6 Interval Since Last Report: 257401 sec Crashes Since Last Report: 39 Per-App Interval Since Last Report: 1178959 sec Per-App Crashes Since Last Report: 8 Anonymous UUID: 7CBACDEB-FBAF-4CD5-9C15-7AEA8AC4B5EF Exception Type: EXC_BAD_ACCESS (SIGBUS) Exception Codes: KERN_PROTECTION_FAILURE at 0x0000000000000000 Crashed Thread: 0 Dispatch queue: com.apple.main-thread Thread 0 Crashed: Dispatch queue: com.apple.main-thread 0 com.apple.HIToolbox 0x99722b6d islGetInputSourceProperty + 1107 1 com.apple.HIToolbox 0x9972262c TSMGetInputSourceProperty + 526 2 com.apple.HIToolbox 0x9972787a _ISSendWindowServerKeyboardLayoutUpdate + 412 3 com.apple.HIToolbox 0x9972622b _TSMSetInputSourceSelected + 1429 4 com.apple.HIToolbox 0x99980209 TSMMessagePortCallBack + 574 5 com.apple.CoreFoundation 0x9226840c __CFMessagePortPerform + 540 6 com.apple.CoreFoundation 0x921d34db __CFRunLoopRun + 6523 7 com.apple.CoreFoundation 0x921d1464 CFRunLoopRunSpecific + 452 8 com.apple.CoreFoundation 0x921d1291 CFRunLoopRunInMode + 97 9 com.apple.HIToolbox 0x99717f58 RunCurrentEventLoopInMode + 392 10 com.apple.HIToolbox 0x99717d0f ReceiveNextEventCommon + 354 11 com.apple.HIToolbox 0x99717b94 BlockUntilNextEventMatchingListInMode + 81 12 com.apple.AppKit 0x9189478d _DPSNextEvent + 847 13 com.apple.AppKit 0x91893fce -[NSApplication nextEventMatchingMask:untilDate:inMode:dequeue:] + 156 14 com.apple.AppKit 0x91856247 -[NSApplication run] + 821 15 com.apple.AppKit 0x9184e2d9 NSApplicationMain + 574 16 com.adiumX.adiumX 0x0000322e 0x1000 + 8750 Thread 1: Dispatch queue: com.apple.libdispatch-manager 0 libSystem.B.dylib 0x968f5982 kevent + 10 1 libSystem.B.dylib 0x968f609c _dispatch_mgr_invoke + 215 2 libSystem.B.dylib 0x968f5559 _dispatch_queue_invoke + 163 3 libSystem.B.dylib 0x968f52fe _dispatch_worker_thread2 + 240 4 libSystem.B.dylib 0x968f4d81 _pthread_wqthread + 390 5 libSystem.B.dylib 0x968f4bc6 start_wqthread + 30 Thread 2: 0 libSystem.B.dylib 0x968f4a12 __workq_kernreturn + 10 1 libSystem.B.dylib 0x968f4fa8 _pthread_wqthread + 941 2 libSystem.B.dylib 0x968f4bc6 start_wqthread + 30 Thread 3: com.apple.CFSocket.private 0 libSystem.B.dylib 0x968ee0c6 select$DARWIN_EXTSN + 10 1 com.apple.CoreFoundation 0x92211c83 __CFSocketManager + 1091 2 libSystem.B.dylib 0x968fc85d _pthread_start + 345 3 libSystem.B.dylib 0x968fc6e2 thread_start + 34 Thread 4: 0 libSystem.B.dylib 0x968f4a12 __workq_kernreturn + 10 1 libSystem.B.dylib 0x968f4fa8 _pthread_wqthread + 941 2 libSystem.B.dylib 0x968f4bc6 start_wqthread + 30 Thread 5: 0 libSystem.B.dylib 0x968cf15a semaphore_timedwait_signal_trap + 10 1 libSystem.B.dylib 0x968fcce5 _pthread_cond_wait + 1066 2 libSystem.B.dylib 0x9692bac8 
pthread_cond_timedwait_relative_np + 47 3 ...apple.AddressBook.framework 0x9310043f -[ABRemoteImageLoader workLoop] + 283 4 com.apple.Foundation 0x97822bf0 -[NSThread main] + 45 5 com.apple.Foundation 0x97822ba0 __NSThread__main__ + 1499 6 libSystem.B.dylib 0x968fc85d _pthread_start + 345 7 libSystem.B.dylib 0x968fc6e2 thread_start + 34 Thread 6: 0 libSystem.B.dylib 0x968cf0fa mach_msg_trap + 10 1 libSystem.B.dylib 0x968cf867 mach_msg + 68 2 com.apple.CoreFoundation 0x921d237f __CFRunLoopRun + 2079 3 com.apple.CoreFoundation 0x921d1464 CFRunLoopRunSpecific + 452 4 com.apple.CoreFoundation 0x921d1291 CFRunLoopRunInMode + 97 5 com.apple.Foundation 0x9785b7d0 +[NSURLConnection(NSURLConnectionReallyInternal) _resourceLoadLoop:] + 329 6 com.apple.Foundation 0x97822bf0 -[NSThread main] + 45 7 com.apple.Foundation 0x97822ba0 __NSThread__main__ + 1499 8 libSystem.B.dylib 0x968fc85d _pthread_start + 345 9 libSystem.B.dylib 0x968fc6e2 thread_start + 34 Thread 0 crashed with X86 Thread State (32-bit): eax: 0x00000670 ebx: 0x9972272e ecx: 0x00000000 edx: 0x00000002 edi: 0xa0c3b214 esi: 0x00000004 ebp: 0xbfffe6c8 esp: 0xbfffe660 ss: 0x0000001f efl: 0x00010202 eip: 0x99722b6d cs: 0x00000017 ds: 0x0000001f es: 0x0000001f fs: 0x00000000 gs: 0x00000037 cr2: 0x00000000 Binary Images: 0x1000 - 0x1a0ff7 +com.adiumX.adiumX 1.4.1 (1.4.1) <136586E8-F3F5-99ED-DB1F-48C0027686CB> /Applications/Adium.app/Contents/MacOS/Adium 0x1f9000 - 0x23efe7 +AIUtilities ??? (???) <565A1BC2-4B50-6277-D127-AFBF01F90CE3> /Applications/Adium.app/Contents/Frameworks/AIUtilities.framework/Versions/A/AIUtilities 0x2af000 - 0x307ff7 +com.adiumX.AdiumPurple ??? (1.0) <F4C2A8E4-695E-7CCE-41F3-F8960AFDC149> /Applications/Adium.app/Contents/Frameworks/AdiumLibpurple.framework/Versions/A/AdiumLibpurple 0x34b000 - 0x3ddfe7 +Adium ??? (???) <774A171B-ED45-D221-6A37-486AA15C8BA5> /Applications/Adium.app/Contents/Frameworks/Adium.framework/Versions/A/Adium 0x439000 - 0x510ff7 +com.googlepages.openspecies.rtool.libglib 2.0.0 (2.0.0) <C620AA58-CFC4-855E-1F2F-F84D9335CD5D> /Applications/Adium.app/Contents/Frameworks/libglib.framework/Versions/2.0.0/libglib 0x53d000 - 0x53eff7 +com.googlepages.openspecies.rtool.libgmodule 2.0.0 (2.0.0) <11FF9396-454A-394B-1B12-D84AD535F6F6> /Applications/Adium.app/Contents/Frameworks/libgmodule.framework/Versions/2.0.0/libgmodule 0x542000 - 0x578fe7 +com.googlepages.openspecies.rtool.libgobject 2.0.0 (2.0.0) <D69FB8D0-D271-EC20-42DD-04FCC65A72BF> /Applications/Adium.app/Contents/Frameworks/libgobject.framework/Versions/2.0.0/libgobject 0x58e000 - 0x590ff7 +com.googlepages.openspecies.rtool.libgthread 2.0.0 (2.0.0) <5D4B8DC6-28E3-9285-8E2A-2D7A3CBE11C5> /Applications/Adium.app/Contents/Frameworks/libgthread.framework/Versions/2.0.0/libgthread 0x594000 - 0x59eff7 +com.googlepages.openspecies.rtool.libintl 8 (8) <343C9F94-8840-4465-64E4-86A0092AD69F> /Applications/Adium.app/Contents/Frameworks/libintl.framework/Versions/8/libintl 0x5a3000 - 0x5cbff7 +com.googlepages.openspecies.rtool.libmeanwhile 1 (1) <7B341D44-FA86-F7C3-E800-7D1169EB9CE2> /Applications/Adium.app/Contents/Frameworks/libmeanwhile.framework/Versions/1/libmeanwhile 0x5e0000 - 0x81bff7 +libpurple 8.5.0 (compatibility 8.0.0) <DEB5CE6C-2A4A-16CA-E0EF-DDE812865406> /Applications/Adium.app/Contents/Frameworks/libpurple.framework/Versions/0/libpurple 0x8c3000 - 0x978fe7 libcrypto.0.9.7.dylib 0.9.7 (compatibility 0.9.7) <AACC86C0-86B4-B1A7-003F-2A0AF68973A2> /usr/lib/libcrypto.0.9.7.dylib 0x9be000 - 0x9ccff7 +com.dpompa.fribidi ??? 
(1.0) <EA8AEBCF-DFE5-85FB-5C0E-EB3AB5B0A950> /Applications/Adium.app/Contents/Frameworks/FriBidi.framework/Versions/A/FriBidi 0x21d2000 - 0x21e1fe7 +AutoHyperlinks ??? (???) <A8B5F9E1-E259-F33F-9E60-F4E37B1ED500> /Applications/Adium.app/Contents/Frameworks/AutoHyperlinks.framework/Versions/A/AutoHyperlinks 0x21e7000 - 0x21f3ff7 +net.brockerhoff.RBSplitView.Framework 1.1.4 (1.1.4) <D92691AA-294F-A85D-E7E1-01AD0A0717D2> /Applications/Adium.app/Contents/Frameworks/RBSplitView.framework/Versions/A/RBSplitView 0x21fb000 - 0x220efff +org.andymatuschak.Sparkle 1.5 Beta (bzr) (340) <E0109DBE-F614-66D0-9B82-6151BC40DAD7> /Applications/Adium.app/Contents/Frameworks/Sparkle.framework/Versions/A/Sparkle 0x221c000 - 0x226cfef +com.adiumX.OTR ??? (1.0) <BAE9D6BD-60D5-B53B-19BC-C17287F55EE9> /Applications/Adium.app/Contents/Frameworks/OTR.framework/Versions/A/OTR 0x227d000 - 0x2280ff7 +org.boredzo.LMX ??? (1.0) <92632179-5CFB-EA6B-AAE7-5F4B98BF0CD9> /Applications/Adium.app/Contents/Frameworks/LMX.framework/Versions/A/LMX 0x2286000 - 0x228dff1 +net.oauth.OAuthConsumer ??? (0.1.1) <025882EC-04DA-763B-18F5-5266A5D185FD> /Applications/Adium.app/Contents/Frameworks/OAuthConsumer.framework/Versions/A/OAuthConsumer 0x2296000 - 0x22a6fe7 +com.googlepages.openspecies.rtool.libjson-glib 1.0.0 (1.0.0) <016CAFB1-DD85-3C9D-411C-C696D9D57213> /Applications/Adium.app/Contents/Frameworks/libjson-glib.framework/Versions/1.0.0/libjson-glib 0x2784000 - 0x2788ff3 com.apple.audio.AudioIPCPlugIn 1.1.6 (1.1.6) <F402CF88-D96C-42A0-3207-49759F496AE8> /System/Library/Extensions/AudioIPCDriver.kext/Contents/Resources/AudioIPCPlugIn.bundle/Contents/MacOS/AudioIPCPlugIn 0x278d000 - 0x2793ffb com.apple.audio.AppleHDAHALPlugIn 1.9.9 (1.9.9f12) <82BFF5E9-2B0E-FE8B-8370-445DD94DA434> /System/Library/Extensions/AppleHDA.kext/Contents/PlugIns/AppleHDAHALPlugIn.bundle/Contents/MacOS/AppleHDAHALPlugIn 0x15fda000 - 0x15fdcff7 apop.so ??? (???) <B365DF5B-6A00-9595-27FF-4811B12B1C19> /usr/lib/sasl2/apop.so 0x15fe0000 - 0x15fe9ff7 digestmd5WebDAV.so ??? (???) <FC8C0A3E-1BC3-5016-95E1-E7EF9FF37242> /usr/lib/sasl2/digestmd5WebDAV.so 0x15fee000 - 0x15ff0ff7 libanonymous.2.so ??? (???) <41A1E196-0AB4-1ADD-6362-BB53A0E57ABA> /usr/lib/sasl2/libanonymous.2.so 0x15ff4000 - 0x15ff6ff7 libcrammd5.2.so ??? (???) <032F08C3-2D26-F956-4799-1012A1BBCB71> /usr/lib/sasl2/libcrammd5.2.so 0x15ffa000 - 0x15ffcff7 login.so ??? (???) <4E0B45F7-243E-A3FD-AA75-EF653590BF17> /usr/lib/sasl2/login.so 0x16100000 - 0x16116ff7 dhx.so ??? (???) <B50D8278-4246-4086-E0AF-3CBE96AE9837> /usr/lib/sasl2/dhx.so 0x16123000 - 0x1612bff7 libdigestmd5.2.so ??? (???) <E8D78B02-D51C-F2CB-C4BA-AC9231ED8006> /usr/lib/sasl2/libdigestmd5.2.so 0x16130000 - 0x16135ff7 libgssapiv2.2.so ??? (???) <193995B9-1C15-BEB2-40B7-1598D82F29BB> /usr/lib/sasl2/libgssapiv2.2.so 0x1613a000 - 0x1613fff7 libntlm.so ??? (???) <F97C955D-E521-216F-E8F0-79E8C907217A> /usr/lib/sasl2/libntlm.so 0x16144000 - 0x1614bff7 libotp.2.so ??? (???) <3DF61F7F-4929-A37D-01CB-9A7A90E3B9B7> /usr/lib/sasl2/libotp.2.so 0x16152000 - 0x16154ff7 libplain.2.so ??? (???) <5CC9D89A-9656-EEE8-64AB-E61A22FA8465> /usr/lib/sasl2/libplain.2.so 0x16158000 - 0x1615cff7 libpps.so ??? (???) <C5A25A99-412E-AD7F-D6FD-C4CC07B7B2A5> /usr/lib/sasl2/libpps.so 0x16161000 - 0x16164ff7 mschapv2.so ??? (???) <34DFB657-5E2E-5B83-713B-F57ACFB1E091> /usr/lib/sasl2/mschapv2.so 0x16169000 - 0x1616bff7 shadow_auxprop.so ??? (???) <4073854F-B4C8-A0D4-C0FF-7A0C93BFC70E> /usr/lib/sasl2/shadow_auxprop.so 0x16170000 - 0x16172ff7 smb_lm.so ??? (???) 
<4B7A54D8-241D-CC8C-8759-4C7DC562369D> /usr/lib/sasl2/smb_lm.so 0x16177000 - 0x1617aff7 smb_nt.so ??? (???) <7B7D31B1-10A1-1AE9-E323-C19A3C52DC03> /usr/lib/sasl2/smb_nt.so 0x1617f000 - 0x16182ff7 smb_ntlmv2.so ??? (???) <3BFE5AA9-F215-36B5-E7D7-46BE1BFD63EA> /usr/lib/sasl2/smb_ntlmv2.so 0x194c1000 - 0x19639fe7 GLEngine ??? (???) <A4BBE58C-1211-0473-7B78-C3BA7AC29C9B> /System/Library/Frameworks/OpenGL.framework/Resources/GLEngine.bundle/GLEngine 0x1966b000 - 0x19a70fe7 libclh.dylib 3.1.1 C (3.1.1) <D1A3D8AD-0FED-4AD2-AB43-CF804B7BDBF9> /System/Library/Extensions/GeForceGLDriver.bundle/Contents/MacOS/libclh.dylib 0x19ae8000 - 0x19b0cfe7 GLRendererFloat ??? (???) <EFE5EC6D-74B2-37A2-92E4-526A2EF6B791> /System/Library/Frameworks/OpenGL.framework/Resources/GLRendererFloat.bundle/GLRendererFloat 0x8f0c8000 - 0x8f811ff7 com.apple.GeForceGLDriver 1.6.24 (6.2.4) <DCC16E52-B1F1-90E6-E839-D30DF4CBA468> /System/Library/Extensions/GeForceGLDriver.bundle/Contents/MacOS/GeForceGLDriver 0x8fe00000 - 0x8fe4162b dyld 132.1 (???) <39AC3185-E633-68AA-7CD6-1230E7F1CEF4> /usr/lib/dyld 0x90003000 - 0x90005fe7 com.apple.ExceptionHandling 1.5 (10) <03218275-EBEC-39AA-895A-BA72A5FDBB7A> /System/Library/Frameworks/ExceptionHandling.framework/Versions/A/ExceptionHandling 0x90006000 - 0x90074ff7 com.apple.QuickLookUIFramework 2.3 (327.6) <74706A08-5399-24FE-00B2-4A702A6B83C1> /System/Library/Frameworks/Quartz.framework/Versions/A/Frameworks/QuickLookUI.framework/Versions/A/QuickLookUI 0x90075000 - 0x900b2ff7 com.apple.SystemConfiguration 1.10.5 (1.10.2) <362DF639-6E5F-9371-9B99-81C581A8EE41> /System/Library/Frameworks/SystemConfiguration.framework/Versions/A/SystemConfiguration 0x900fa000 - 0x901d5feb com.apple.DesktopServices 1.5.9 (1.5.9) <CED00AC1-924B-0E45-7D5E-1CEA8929F5BE> /System/Library/PrivateFrameworks/DesktopServicesPriv.framework/Versions/A/DesktopServicesPriv 0x901d6000 - 0x9021aff3 com.apple.coreui 2 (114) <2234855E-3BED-717F-0BFA-D1A289ECDBDA> /System/Library/PrivateFrameworks/CoreUI.framework/Versions/A/CoreUI 0x9021b000 - 0x9021bff7 com.apple.CoreServices 44 (44) <51CFA89A-33DB-90ED-26A8-67D461718A4A> /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices 0x90258000 - 0x90302fe7 com.apple.CFNetwork 454.11.5 (454.11.5) <D8963574-285A-3BD6-6B25-07D39C6F67A4> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CFNetwork.framework/Versions/A/CFNetwork 0x90303000 - 0x9033efeb libFontRegistry.dylib ??? (???) <4FB144ED-8AF9-27CF-B315-DCE5575D5231> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontRegistry.dylib 0x90342000 - 0x90366ff7 libJPEG.dylib ??? (???) <46AF3A0F-2B8D-87B9-62D4-0905678A64DA> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ImageIO.framework/Versions/A/Resources/libJPEG.dylib 0x90367000 - 0x9036afe7 libmathCommon.A.dylib 315.0.0 (compatibility 1.0.0) <1622A54F-1A98-2CBE-B6A4-2122981A500E> /usr/lib/system/libmathCommon.A.dylib 0x9036b000 - 0x903d5fe7 libstdc++.6.dylib 7.9.0 (compatibility 7.0.0) <411D87F4-B7E1-44EB-F201-F8B4F9227213> /usr/lib/libstdc++.6.dylib 0x903d6000 - 0x903daff7 IOSurface ??? (???) 
<D849E1A5-6B0C-2A05-2765-850EC39BA2FF> /System/Library/Frameworks/IOSurface.framework/Versions/A/IOSurface 0x903db000 - 0x903edff7 com.apple.MultitouchSupport.framework 207.10 (207.10) <E1A6F663-570B-CE54-0F8A-BBCCDECE3B42> /System/Library/PrivateFrameworks/MultitouchSupport.framework/Versions/A/MultitouchSupport 0x90437000 - 0x90470ff7 libcups.2.dylib 2.8.0 (compatibility 2.0.0) <D6F24434-8217-DF72-2126-1953080680D7> /usr/lib/libcups.2.dylib 0x9049b000 - 0x904ccff7 libGLImage.dylib ??? (???) <78F59EAB-BBD4-7366-CA84-970547501978> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLImage.dylib 0x904ec000 - 0x909a5ffb com.apple.VideoToolbox 0.484.20 (484.20) <E7B9F015-2569-43D7-5268-375ED937ECA5> /System/Library/PrivateFrameworks/VideoToolbox.framework/Versions/A/VideoToolbox 0x909a6000 - 0x90ab5fe7 com.apple.WebKit 6533.19 (6533.19.4) <A942073C-83DF-33ED-3D01-A24DE8AEAB3D> /System/Library/Frameworks/WebKit.framework/Versions/A/WebKit 0x90ab6000 - 0x90ae6ff7 com.apple.MeshKit 1.1 (49.2) <5A74D1A4-4B97-FE39-4F4D-E0B80F0ADD87> /System/Library/PrivateFrameworks/MeshKit.framework/Versions/A/MeshKit 0x90cc5000 - 0x90cfdff7 com.apple.LDAPFramework 2.0 (120.1) <131ED804-DD88-D84F-13F8-D48E0012B96F> /System/Library/Frameworks/LDAP.framework/Versions/A/LDAP 0x90cfe000 - 0x90f61fef com.apple.security 6.1.1 (37594) <1949216A-7583-B73A-6112-4D55CA5852E3> /System/Library/Frameworks/Security.framework/Versions/A/Security 0x90f62000 - 0x90f64ff7 com.apple.securityhi 4.0 (36638) <E7D83480-77BB-72F9-72F3-AEE198CE589F> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SecurityHI.framework/Versions/A/SecurityHI 0x90f65000 - 0x910e7fe7 libicucore.A.dylib 40.0.0 (compatibility 1.0.0) <35DB7644-0780-D2AB-F6A9-45F28D2D434A> /usr/lib/libicucore.A.dylib 0x910e8000 - 0x91217fe3 com.apple.audio.toolbox.AudioToolbox 1.6.5 (1.6.5) <0A0F68E5-4806-DB51-764B-D97554B801AD> /System/Library/Frameworks/AudioToolbox.framework/Versions/A/AudioToolbox 0x91218000 - 0x91538ff3 com.apple.CoreServices.CarbonCore 861.23 (861.23) <B08756E4-32C5-CC33-0268-7C00A5ED7537> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/CarbonCore 0x91539000 - 0x91578ff7 com.apple.ImageCaptureCore 1.0.3 (1.0.3) <7E02D104-F31C-CF72-71B4-DA5DF7B48337> /System/Library/Frameworks/ImageCaptureCore.framework/Versions/A/ImageCaptureCore 0x91579000 - 0x915b0fe7 libssl.0.9.8.dylib 0.9.8 (compatibility 0.9.8) <7DCB5938-3140-E71A-92BD-8C242F30C8F5> /usr/lib/libssl.0.9.8.dylib 0x915c4000 - 0x9163ffff com.apple.AppleVAFramework 4.10.12 (4.10.12) <89C4EBE2-FE27-3160-0BD1-D0C2ED5F3605> /System/Library/PrivateFrameworks/AppleVA.framework/Versions/A/AppleVA 0x91640000 - 0x91742fef com.apple.MeshKitIO 1.1 (49.2) <D0401AC5-1F92-2BBB-EBAB-58EDD3BA61B9> /System/Library/PrivateFrameworks/MeshKit.framework/Versions/A/Frameworks/MeshKitIO.framework/Versions/A/MeshKitIO 0x91743000 - 0x91744ff7 com.apple.audio.units.AudioUnit 1.6.5 (1.6.5) <BE4C2495-B758-AD22-DCC0-56A6791E948E> /System/Library/Frameworks/AudioUnit.framework/Versions/A/AudioUnit 0x91769000 - 0x91777fe7 libz.1.dylib 1.2.3 (compatibility 1.0.0) <33C1B260-ED05-945D-FC33-EF56EC791E2E> /usr/lib/libz.1.dylib 0x91778000 - 0x9179afef com.apple.DirectoryService.Framework 3.6 (621.9) <F2EEE9D7-D4FB-14F3-E647-ABD32754F557> /System/Library/Frameworks/DirectoryService.framework/Versions/A/DirectoryService 0x9184c000 - 0x9212cff7 com.apple.AppKit 6.6.7 (1038.35) <ABC7783C-E4D5-B848-BED6-99451D94D120> 
/System/Library/Frameworks/AppKit.framework/Versions/C/AppKit 0x9212d000 - 0x9218efe7 com.apple.CoreText 3.5.0 (???) <BB50C045-25F5-65B8-B1DB-8CDAEF45EB46> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/CoreText.framework/Versions/A/CoreText 0x9218f000 - 0x92194ff7 com.apple.OpenDirectory 10.6 (10.6) <C1B46982-7D3B-3CC4-3BC2-3E4B595F0231> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/OpenDirectory 0x92195000 - 0x92310fe7 com.apple.CoreFoundation 6.6.4 (550.42) <C78D5079-663E-9734-7AFA-6CE79A0539F1> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation 0x92311000 - 0x92351ff3 com.apple.securityinterface 4.0.1 (37214) <43CE8A8D-64E5-F36E-4900-FBB1BD6557F1> /System/Library/Frameworks/SecurityInterface.framework/Versions/A/SecurityInterface 0x92352000 - 0x92355ff7 libCGXType.A.dylib 545.0.0 (compatibility 64.0.0) <B624AACE-991B-0FFA-2482-E69970576CE1> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/CoreGraphics.framework/Versions/A/Resources/libCGXType.A.dylib 0x924b9000 - 0x925fcfef com.apple.QTKit 7.6.6 (1756) <4D809734-4E1B-8E18-C825-86C5422FC3DC> /System/Library/Frameworks/QTKit.framework/Versions/A/QTKit 0x925fd000 - 0x9260dff7 libsasl2.2.dylib 3.15.0 (compatibility 3.0.0) <E276514D-394B-2FDD-6264-07A444AA6A4E> /usr/lib/libsasl2.2.dylib 0x92615000 - 0x92618ff7 libCoreVMClient.dylib ??? (???) <1F738E81-BB71-32C5-F1E9-C1302F71021C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libCoreVMClient.dylib 0x9262c000 - 0x92652ffb com.apple.DictionaryServices 1.1.2 (1.1.2) <43E1D565-6E01-3681-F2E5-72AE4C3A097A> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/DictionaryServices.framework/Versions/A/DictionaryServices 0x9294d000 - 0x9295dff7 com.apple.DSObjCWrappers.Framework 10.6 (134) <95DC4010-ECC4-3A75-5DEE-11BB2AE895EE> /System/Library/PrivateFrameworks/DSObjCWrappers.framework/Versions/A/DSObjCWrappers 0x9295e000 - 0x929a0ff7 libvDSP.dylib 268.0.1 (compatibility 1.0.0) <8A4721DE-25C4-C8AA-EA90-9DA7812E3EBA> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvDSP.dylib 0x929a1000 - 0x929abfe7 com.apple.audio.SoundManager 3.9.3 (3.9.3) <DE0E0EF6-8190-3F65-6BDD-5AC9D8A025D6> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/CarbonSound.framework/Versions/A/CarbonSound 0x929ac000 - 0x92a5cff3 com.apple.ColorSync 4.6.3 (4.6.3) <0354B408-665F-8B3F-87FF-64E6322276F0> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ColorSync.framework/Versions/A/ColorSync 0x92a5d000 - 0x92c3ffff com.apple.imageKit 2.0.3 (1.0) <B4DB05F7-01C5-35EE-7AB9-41BD9D63F075> /System/Library/Frameworks/Quartz.framework/Versions/A/Frameworks/ImageKit.framework/Versions/A/ImageKit 0x92cf5000 - 0x92d4bff7 com.apple.MeshKitRuntime 1.1 (49.2) <CB9F38B1-E107-EA62-EDFF-02EE79F6D1A5> /System/Library/PrivateFrameworks/MeshKit.framework/Versions/A/Frameworks/MeshKitRuntime.framework/Versions/A/MeshKitRuntime 0x92d4c000 - 0x92d55ff7 com.apple.DiskArbitration 2.3 (2.3) <6AA6DDF6-AFC3-BBDB-751A-64AE3580A49E> /System/Library/Frameworks/DiskArbitration.framework/Versions/A/DiskArbitration 0x92dc6000 - 0x92df8fe3 libTrueTypeScaler.dylib ??? (???) 
<6E9D1A50-330E-F1F4-F93D-9ECC8A61B21A> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libTrueTypeScaler.dylib 0x92e7a000 - 0x92e88ff7 com.apple.opengl 1.6.11 (1.6.11) <286D1BC4-4CD8-3CD4-F723-5C196FE15FE0> /System/Library/Frameworks/OpenGL.framework/Versions/A/OpenGL 0x92e89000 - 0x92ec7ff7 com.apple.QuickLookFramework 2.3 (327.6) <66955C29-0C99-D02C-DB18-4952AFB4E886> /System/Library/Frameworks/QuickLook.framework/Versions/A/QuickLook 0x92ec8000 - 0x92f92fef com.apple.CoreServices.OSServices 357 (357) <3A26F553-722D-3536-EEDE-FB41FCDAA7FD> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/OSServices.framework/Versions/A/OSServices 0x92f93000 - 0x92f97ff7 libGIF.dylib ??? (???) <DA5758A4-71B0-DD6E-7402-B7FB15387569> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ImageIO.framework/Versions/A/Resources/libGIF.dylib 0x92f98000 - 0x93099fe7 libxml2.2.dylib 10.3.0 (compatibility 10.0.0) <ED8E45C6-B078-15E8-938D-99D8FD1EAE64> /usr/lib/libxml2.2.dylib 0x9309a000 - 0x932a1feb com.apple.AddressBook.framework 5.0.3 (875) <759B660B-00F6-F08C-37CD-69468C774B5E> /System/Library/Frameworks/AddressBook.framework/Versions/A/AddressBook 0x932a2000 - 0x933aeff7 libGLProgrammability.dylib ??? (???) <8B308FAE-843F-EE76-0254-3374CBFFA7B3> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLProgrammability.dylib 0x933bf000 - 0x933c2ffb com.apple.help 1.3.1 (41) <6A5AD406-9D8E-5BAC-51E1-E09AB9A6D159> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Help.framework/Versions/A/Help 0x933c3000 - 0x9372eff7 com.apple.QuartzCore 1.6.3 (227.34) <CC1C1631-D8D1-D416-171E-A1683274E479> /System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore 0x9372f000 - 0x93740ff7 com.apple.LangAnalysis 1.6.6 (1.6.6) <3036AD83-4F1D-1028-54EE-54165E562650> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/LangAnalysis.framework/Versions/A/LangAnalysis 0x93741000 - 0x93755ffb com.apple.speech.synthesis.framework 3.10.35 (3.10.35) <9F5CE4F7-D05C-8C14-4B76-E43D07A8A680> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/SpeechSynthesis.framework/Versions/A/SpeechSynthesis 0x93790000 - 0x937e1ff7 com.apple.HIServices 1.8.1 (???) <51BDD848-32A5-2425-BE07-BD037A89630A> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/HIServices.framework/Versions/A/HIServices 0x937e2000 - 0x937e2ff7 liblangid.dylib ??? (???) 
<FCC37057-CDD7-2AF1-21AF-52A06C4048FF> /usr/lib/liblangid.dylib 0x937e3000 - 0x93a0eff3 com.apple.QuartzComposer 4.2 ({156.28}) <08AF01DC-110D-9443-3916-699DBDED0149> /System/Library/Frameworks/Quartz.framework/Versions/A/Frameworks/QuartzComposer.framework/Versions/A/QuartzComposer 0x93a17000 - 0x93a21ffb com.apple.speech.recognition.framework 3.11.1 (3.11.1) <7486003F-8FDB-BD6C-CB34-DE45315BD82C> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SpeechRecognition.framework/Versions/A/SpeechRecognition 0x93a22000 - 0x93a28fff com.apple.CommonPanels 1.2.4 (91) <CE92759E-865E-8A3B-1488-ECD497E4074D> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/CommonPanels.framework/Versions/A/CommonPanels 0x93b47000 - 0x93b94feb com.apple.DirectoryService.PasswordServerFramework 6.0 (6.0) <27F3FF53-F818-9836-2101-3E963FE0C0E0> /System/Library/PrivateFrameworks/PasswordServer.framework/Versions/A/PasswordServer 0x93b95000 - 0x93c30ff7 com.apple.ApplicationServices.ATS 4.4 (???) <ECB16606-4DF8-4AFB-C91D-F7947C26040F> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/ATS 0x93c31000 - 0x93c31ff7 com.apple.quartzframework 1.5 (1.5) <7DD4EBF1-60C4-9329-08EF-6E59731D9430> /System/Library/Frameworks/Quartz.framework/Versions/A/Quartz 0x93d51000 - 0x93d72fe7 com.apple.opencl 12.3 (12.3) <DEA600BF-4F54-66B5-DB2F-DC57FD518543> /System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL 0x93d73000 - 0x93e77fe7 libcrypto.0.9.8.dylib 0.9.8 (compatibility 0.9.8) <BDEFA030-5E75-7C47-2904-85AB16937F45> /usr/lib/libcrypto.0.9.8.dylib 0x93e78000 - 0x93ef8feb com.apple.SearchKit 1.3.0 (1.3.0) <7AE32A31-2B8E-E271-C03A-7A0F7BAFC85C> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/SearchKit.framework/Versions/A/SearchKit 0x93ef9000 - 0x93f68ff7 libvMisc.dylib 268.0.1 (compatibility 1.0.0) <595A5539-9F54-63E6-7AAC-C04E1574B050> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib 0x93f69000 - 0x94122feb com.apple.ImageIO.framework 3.0.4 (3.0.4) <C145139E-24C4-5A3D-B17C-809D528354B2> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ImageIO.framework/Versions/A/ImageIO 0x94123000 - 0x94166ff7 com.apple.NavigationServices 3.5.4 (182) <8DC6FD4A-6C74-9C23-A4C3-715B44A8D28C> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/NavigationServices.framework/Versions/A/NavigationServices 0x941a0000 - 0x941fafe7 com.apple.CorePDF 1.3 (1.3) <EA168671-F44F-BFE4-AA7D-3801DA29A650> /System/Library/PrivateFrameworks/CorePDF.framework/Versions/A/CorePDF 0x941fb000 - 0x94258ff7 com.apple.framework.IOKit 2.0 (???) <A769737F-E0D6-FB06-29B4-915CF4F43420> /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit 0x94259000 - 0x9425dff7 libGFXShared.dylib ??? (???) <C3A805C4-C0E5-B300-430A-7E811395CB8E> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGFXShared.dylib 0x942ef000 - 0x9439dff3 com.apple.ink.framework 1.3.3 (107) <233A981E-A2F9-56FB-8BDE-C2DEC3F20784> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Ink.framework/Versions/A/Ink 0x9439e000 - 0x94b8d557 com.apple.CoreGraphics 1.545.0 (???) <1AB39678-00D5-FB88-3B41-93D78348E0DE> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/CoreGraphics.framework/Versions/A/CoreGraphics 0x94b8e000 - 0x94bd7fe7 libTIFF.dylib ??? (???) 
<AC1FC806-F7F4-174B-375F-FE5D6008666C> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ImageIO.framework/Versions/A/Resources/libTIFF.dylib 0x94e21000 - 0x94f4ffe7 com.apple.CoreData 102.1 (251) <87FE6861-F2D6-773D-ED45-345272E56463> /System/Library/Frameworks/CoreData.framework/Versions/A/CoreData 0x95ea3000 - 0x95ee6ff7 libGLU.dylib ??? (???) <F8580594-0B38-F3ED-A715-CB3776B747A0> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLU.dylib 0x95eef000 - 0x95f71ffb SecurityFoundation ??? (???) <A8D248DE-8670-970D-39E3-A9738CFDBEE1> /System/Library/Frameworks/SecurityFoundation.framework/Versions/A/SecurityFoundation 0x95f72000 - 0x95f7fff7 com.apple.NetFS 3.2.1 (3.2.1) <94A52A6D-F071-09D7-E80F-F633F17233FE> /System/Library/Frameworks/NetFS.framework/Versions/A/NetFS 0x95f80000 - 0x95f82ff7 libRadiance.dylib ??? (???) <10048B4A-2AE8-A4E2-21B8-C6E7A8C5B76F> snip... Superuser character limit :-(

    Read the article

  • Powershell script to change screen Orientation

    - by user161964
    I wrote a script to change Primary screen orientation to portrait. my screen is 1920X1200 It runs and no error reported. But the screen does not rotated as i expected. The code was modified from Set-ScreenResolution (Andy Schneider) Does anybody can help me take a look? some reference site: 1.set-screenresolution http://gallery.technet.microsoft.com/ScriptCenter/2a631d72-206d-4036-a3f2-2e150f297515/ 2.C code for change oridentation (MSDN) Changing Screen Orientation Programmatically http://msdn.microsoft.com/en-us/library/ms812499.aspx my code as below: Function Set-ScreenOrientation { $pinvokeCode = @" using System; using System.Runtime.InteropServices; namespace Resolution { [StructLayout(LayoutKind.Sequential)] public struct DEVMODE1 { [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)] public string dmDeviceName; public short dmSpecVersion; public short dmDriverVersion; public short dmSize; public short dmDriverExtra; public int dmFields; public short dmOrientation; public short dmPaperSize; public short dmPaperLength; public short dmPaperWidth; public short dmScale; public short dmCopies; public short dmDefaultSource; public short dmPrintQuality; public short dmColor; public short dmDuplex; public short dmYResolution; public short dmTTOption; public short dmCollate; [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)] public string dmFormName; [MarshalAs(UnmanagedType.U4)] public short dmDisplayOrientation public short dmLogPixels; public short dmBitsPerPel; public int dmPelsWidth; public int dmPelsHeight; public int dmDisplayFlags; public int dmDisplayFrequency; public int dmICMMethod; public int dmICMIntent; public int dmMediaType; public int dmDitherType; public int dmReserved1; public int dmReserved2; public int dmPanningWidth; public int dmPanningHeight; }; class User_32 { [DllImport("user32.dll")] public static extern int EnumDisplaySettings(string deviceName, int modeNum, ref DEVMODE1 devMode); [DllImport("user32.dll")] public static extern int ChangeDisplaySettings(ref DEVMODE1 devMode, int flags); public const int ENUM_CURRENT_SETTINGS = -1; public const int CDS_UPDATEREGISTRY = 0x01; public const int CDS_TEST = 0x02; public const int DISP_CHANGE_SUCCESSFUL = 0; public const int DISP_CHANGE_RESTART = 1; public const int DISP_CHANGE_FAILED = -1; } public class PrmaryScreenOrientation { static public string ChangeOrientation() { DEVMODE1 dm = GetDevMode1(); if (0 != User_32.EnumDisplaySettings(null, User_32.ENUM_CURRENT_SETTINGS, ref dm)) { dm.dmDisplayOrientation = DMDO_90 dm.dmPelsWidth = 1200; dm.dmPelsHeight = 1920; int iRet = User_32.ChangeDisplaySettings(ref dm, User_32.CDS_TEST); if (iRet == User_32.DISP_CHANGE_FAILED) { return "Unable To Process Your Request. Sorry For This Inconvenience."; } else { iRet = User_32.ChangeDisplaySettings(ref dm, User_32.CDS_UPDATEREGISTRY); switch (iRet) { case User_32.DISP_CHANGE_SUCCESSFUL: { return "Success"; } case User_32.DISP_CHANGE_RESTART: { return "You Need To Reboot For The Change To Happen.\n If You Feel Any Problem After Rebooting Your Machine\nThen Try To Change Resolution In Safe Mode."; } default: { return "Failed"; } } } } else { return "Failed To Change."; } } private static DEVMODE1 GetDevMode1() { DEVMODE1 dm = new DEVMODE1(); dm.dmDeviceName = new String(new char[32]); dm.dmFormName = new String(new char[32]); dm.dmSize = (short)Marshal.SizeOf(dm); return dm; } } } "@ Add-Type $pinvokeCode -ErrorAction SilentlyContinue [Resolution.PrmaryScreenOrientation]::ChangeOrientation() }

    Read the article

  • Why can't I convert FLV to MP4 format using FFmpeg when MP3 works?

    - by hugemeow
    In fact I have succeeded to convert FLV to MP3: D:\tmp\ffmpeg-20121005-git-d9dfe9a-win64-static\ffmpeg-20121005-git-d9dfe9a-win 4-static\bin>ffmpeg.exe -i a.flv -acodec mp3 a.mp3 ffmpeg version N-45080-gd9dfe9a Copyright (c) 2000-2012 the FFmpeg developers built on Oct 5 2012 16:49:01 with gcc 4.7.1 (GCC) configuration: --enable-gpl --enable-version3 --disable-pthreads --enable-run ime-cpudetect --enable-avisynth --enable-bzlib --enable-frei0r --enable-libass -enable-libcelt --enable-libopencore-amrnb --enable-libopencore-amrwb --enable- ibfreetype --enable-libgsm --enable-libmp3lame --enable-libnut --enable-libopen peg --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libthe ra --enable-libutvideo --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-l bvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --en ble-zlib libavutil 51. 73.102 / 51. 73.102 libavcodec 54. 63.100 / 54. 63.100 libavformat 54. 29.105 / 54. 29.105 libavdevice 54. 3.100 / 54. 3.100 libavfilter 3. 19.102 / 3. 19.102 libswscale 2. 1.101 / 2. 1.101 libswresample 0. 16.100 / 0. 16.100 libpostproc 52. 1.100 / 52. 1.100 Input #0, flv, from 'a.flv': Metadata: metadatacreator : iku hasKeyframes : true hasVideo : true hasAudio : true hasMetadata : true canSeekToEnd : false datasize : 16906383 videosize : 14558526 audiosize : 2270465 lasttimestamp : 530 lastkeyframetimestamp: 529 lastkeyframelocation: 16893721 Duration: 00:08:49.73, start: 0.000000, bitrate: 255 kb/s Stream #0:0: Video: h264 (Main), yuv420p, 448x336 [SAR 1:1 DAR 4:3], 218 kb s, 15 tbr, 1k tbn, 30 tbc Stream #0:1: Audio: aac, 44100 Hz, stereo, s16, 32 kb/s File 'a.mp3' already exists. Overwrite ? [y/N] y Output #0, mp3, to 'a.mp3': Metadata: metadatacreator : iku hasKeyframes : true hasVideo : true hasAudio : true hasMetadata : true canSeekToEnd : false datasize : 16906383 videosize : 14558526 audiosize : 2270465 lasttimestamp : 530 lastkeyframetimestamp: 529 lastkeyframelocation: 16893721 TSSE : Lavf54.29.105 Stream #0:0: Audio: mp3, 44100 Hz, stereo, s16 Stream mapping: Stream #0:1 -> #0:0 (aac -> libmp3lame) Press [q] to stop, [?] for help size= 8279kB time=00:08:49.78 bitrate= 128.0kbits/s video:0kB audio:8278kB subtitle:0 global headers:0kB muxing overhead 0.006842% But I failed to convert FLV to MP4. Why is the encoder 'mp4' unknown? What's more, how can I find the codecs which are already supported by my FFmpeg? D:\tmp\ffmpeg-20121005-git-d9dfe9a-win64-static\ffmpeg-20121005-git-d9dfe9a-win6 4-static\bin>ffmpeg.exe -i a.flv -acodec mp4 aa.mp4 ffmpeg version N-45080-gd9dfe9a Copyright (c) 2000-2012 the FFmpeg developers built on Oct 5 2012 16:49:01 with gcc 4.7.1 (GCC) configuration: --enable-gpl --enable-version3 --disable-pthreads --enable-runt ime-cpudetect --enable-avisynth --enable-bzlib --enable-frei0r --enable-libass - -enable-libcelt --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-l ibfreetype --enable-libgsm --enable-libmp3lame --enable-libnut --enable-libopenj peg --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheo ra --enable-libutvideo --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-li bvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --ena ble-zlib libavutil 51. 73.102 / 51. 73.102 libavcodec 54. 63.100 / 54. 63.100 libavformat 54. 29.105 / 54. 29.105 libavdevice 54. 3.100 / 54. 3.100 libavfilter 3. 19.102 / 3. 19.102 libswscale 2. 1.101 / 2. 1.101 libswresample 0. 16.100 / 0. 16.100 libpostproc 52. 1.100 / 52. 
1.100 Input #0, flv, from 'a.flv': Metadata: metadatacreator : iku hasKeyframes : true hasVideo : true hasAudio : true hasMetadata : true canSeekToEnd : false datasize : 16906383 videosize : 14558526 audiosize : 2270465 lasttimestamp : 530 lastkeyframetimestamp: 529 lastkeyframelocation: 16893721 Duration: 00:08:49.73, start: 0.000000, bitrate: 255 kb/s Stream #0:0: Video: h264 (Main), yuv420p, 448x336 [SAR 1:1 DAR 4:3], 218 kb/ s, 15 tbr, 1k tbn, 30 tbc Stream #0:1: Audio: aac, 44100 Hz, stereo, s16, 32 kb/s Unknown encoder 'mp4'

    Read the article

  • VS 2012 Code Review – Before Check In OR After Check In?

    - by Tarun Arora
    “Is Code Review Important and Effective?” There is a consensus across the industry that code review is an effective and practical way to collar code inconsistency and possible defects early in the software development life cycle. Among others some of the advantages of code reviews are, Bugs are found faster Forces developers to write readable code (code that can be read without explanation or introduction!) Optimization methods/tricks/productive programs spread faster Programmers as specialists "evolve" faster It's fun “Code review is systematic examination (often known as peer review) of computer source code. It is intended to find and fix mistakes overlooked in the initial development phase, improving both the overall quality of software and the developers' skills. Reviews are done in various forms such as pair programming, informal walkthroughs, and formal inspections.” Wikipedia No where does the definition mention whether its better to review code before the code has been committed to version control or after the commit has been performed. No matter which side you favour, Visual Studio 2012 allows you to request for a code review both before check in and also request for a review after check in. Let’s weigh the pros and cons of the approaches independently. Code Review Before Check In or Code Review After Check In? Approach 1 – Code Review before Check in Developer completes the code and feels the code quality is appropriate for check in to TFS. The developer raises a code review request to have a second pair of eyes validate if the code abides to the recommended best practices, will not result in any defects due to common coding mistakes and whether any optimizations can be made to improve the code quality.                                             Image 1 – code review before check in Pros Everything that gets committed to source control is reviewed. Minimizes the chances of smelly code making its way into the code base. Decreases the cost of fixing bugs, remember, the earlier you find them, the lesser the pain in fixing them. Cons Development Code Freeze – Since the changes aren’t in the source control yet. Further development can only be done off-line. The changes have not been through a CI build, hard to say whether the code abides to all build quality standards. Inconsistent! Cumbersome to track the actual code review process.  Not every change to the code base is worth reviewing, a lot of effort is invested for very little gain. Approach 2 – Code Review after Check in Developer checks in, random code reviews are performed on the checked in code.                                                      Image 2 – Code review after check in Pros The code has already passed the CI build and run through any code analysis plug ins you may have running on the build server. Instruct the developer to ensure ZERO fx cop, style cop and static code analysis before check in. Code is cleaner and smell free even before the code review. No Offline development, developers can continue to develop against the source control. Cons Bad code can easily make its way into the code base. Since the review take place much later in the cycle, the cost of fixing issues can prove to be much higher. Approach 3 – Hybrid Approach The community advocates a more hybrid approach, a blend of tooling and human accountability quotient.                                                               Image 3 – Hybrid Approach 1. Code review high impact check ins. 
It is not possible to review everything, by setting up code review check in policies you can end up slowing your team. More over, the code that you are reviewing before check in hasn't even been through a green CI build either. 2. Tooling. Let the tooling work for you. By running static analysis, fx cop, style cop and other plug ins on the build agent, you can identify the real issues that in my opinion can't possibly be identified using human reviews. Configure the tooling to report back top 10 issues every day. Mandate the manual code review of individuals who keep making it to this list of shame more often. 3. During Merge. I would prefer eliminating some of the other code issues during merge from Main branch to the release branch. In a scrum project this is still easier because cheery picking the merges is a possibility and the size of code being reviewed is still limited. Let the tooling work for you, if some one breaks the CI build often, put them on a gated check in build course until you see improvement. If some one appears on the top 10 list of shame generated via the build then ensure that all their code is reviewed till you see improvement. At the end of the day, the goal is to ensure that the code being delivered is top quality. By enforcing a code review before any check in, you force the developer to work offline or stay put till the review is complete. What do the experts say? So I asked a few expects what they thought of “Code Review quality gate before Checking in code?" Terje Sandstrom | Microsoft ALM MVP You mean a review quality gate BEFORE checking in code????? That would mean a lot of code staying either local or in shelvesets, and not even been through a CI build, and a green CI build being the main criteria for going further, f.e. to the review state. I would not like code laying around with no checkin’s. Having a requirement that code is checked in small pieces, 4-8 hours work max, and AT LEAST daily checkins, a manual code review comes second down the lane. I would expect review quality gates to happen before merging back to main, or before merging to release.  But that would all be on checked-in code.  Branching is absolutely one way to ease the pain.   Another way we are using is automatic quality builds, running metrics, coverage, static code analysis.  Unfortunately it takes some time, would be great to be on CI’s – but…., so it’s done scheduled every night. Based on this we get, among other stuff,  top 10 lists of suspicious code, which is then subjected to reviews.  If a person seems to be very popular on these top 10 lists, we subject every check in from that person to a review for a period. That normally helps.   None of the clients I have can afford to have every checkin reviewed, so we need to find ways around it. I don’t disagree with the nicety of having all the code reviewed, but I find it hard to find those resources in today’s enterprises. David V. Corbin | Visual Studio ALM Ranger I tend to agree with both sides. I hate having code that is not checked in, but at the same time hate having “bad” code in the repository. I have found that branching is one approach to solving this dilemma. Code is checked into the private/feature branch before the review, but is not merged over to the “official” branch until after the review. I advocate both, depending on circumstance (especially team dynamics)   - The “pre-checkin” is usually for elements that may impact the project as a whole. Think of it as another “gate” along with passing unit tests. 
- The “post-checkin” may very well not be at the changeset level, but correlates to a review at the “user story” level.   Again, this depends on team dynamics in play…. Robert MacLean | Microsoft ALM MVP I do not think there is no right answer for the industry as a whole. In short the question is why do you do reviews? Your question implies risk mitigation, so in low risk areas you can get away with it after check in while in high risk you need to do it before check in. An example is those new to a team or juniors need it much earlier (maybe that is before checkin, maybe that is soon after) than seniors who have shipped twenty sprints on the team. Abhimanyu Singhal | Visual Studio ALM Ranger Depends on per scenario basis. We recommend post check-in reviews when: 1. We don't want to block other checks and processes on manual code reviews. Manual reviews take time, and some pieces may not require manual reviews at all. 2. We need to trace all changes and track history. 3. We have a code promotion strategy/process in place. For risk mitigation, post checkin code can be promoted to Accepted branches. Or can be rejected. Pre Checkin Reviews are used when 1. There is a high risk factor associated 2. Reviewers are generally (most of times) have immediate availability. 3. Team does not have strict tracking needs. Simply speaking, no single process fits all scenarios. You need to select what works best for your team/project. Thomas Schissler | Visual Studio ALM Ranger This is an interesting discussion, I’m right now discussing details about executing code reviews with my teams. I see and understand the aspects you brought in, but there is another side as well, I’d like to point out. 1.) If you do reviews per check in this is not very practical as a hard rule because this will disturb the flow of the team very often or it will lead to reduce the checkin frequency of the devs which I would not accept. 2.) If you do later reviews, for example if you review PBIs, it is not easy to find out which code you should review. Either you review all changesets associate with the PBI, but then you might review code which has been changed with a later checkin and the dev maybe has already fixed the issue. Or you review the diff of the latest changeset of the PBI with the first but then you might also review changes of other PBIs. Jakob Leander | Sr. Director, Avanade In my experience, manual code review: 1. Does not get done and at the very least does not get redone after changes (regardless of intentions at start of project) 2. When a project actually do it, they often do not do it right away = errors pile up 3. Requires a lot of time discussing/defining the standard and for the team to learn it However code review is very important since e.g. even small memory leaks in a high volume web solution have big consequences In the last years I have advocated following approach for code review - Architects up front do “at least one best practice example” of each type of component and tell the team. Copy from this one. This should include error handling, logging, security etc. - Dev lead on project continuously browse code to validate that the best practices are used. Especially that patterns etc. are not broken. You can do this formally after each sprint/iteration if you want. Once this is validated it is unlikely to “go bad” even during later code changes Agree with customer to rely on static code analysis from Visual Studio as the one and only coding standard. 
This has HUUGE benefits - You can easily tweak to reach the level you desire together with customer - It is easy to measure for both developers/management - It is 100% consistent across code base - It gets validated all the time so you never end up getting hammered by a customer review in the end - It is easy to tell the developer that you do not want code back unless it has zero errors = minimize communication You need to track this at least during nightly builds and make sure team sees total # issues. Do not allow #issues it to grow uncontrolled. On the project I run I require code analysis to have run on code before checkin (checkin rule). This means -  You have to have clean compile (or CA wont run) so this is extra benefit = very few broken builds - You can change a few of the rules to compile as errors instead of warnings. I often do this for “missing dispose” issues which you REALLY do not want in your app Tip: Place your custom CA rules files as part of solution. That  way it works when you do branching etc. (path to CA file is relative in VS) Some may argue that CA is not as good as manual inspection. But since manual inspection in reality suffers from the 3 issues in start it is IMO a MUCH better (and much cheaper) approach from helicopter perspective Tirthankar Dutta | Director, Avanade I think code review should be run both before and after check ins. There are some code metrics that are meant to be run on the entire codebase … Also, especially on multi-site projects, one should strive to architect in a way that lets men manage the framework while boys write the repetitive code… scales very well with the need to review less by containment and imposing architectural restrictions to emphasise the design. Bruno Capuano | Microsoft ALM MVP For code reviews (means peer reviews) in distributed team I use http://www.vsanywhere.com/default.aspx  David Jobling | Global Sr. Director, Avanade Peer review is the only way to scale and its a great practice for all in the team to learn to perform and accept. In my experience you soon learn who's code to watch more than others and tune the attention. Mikkel Toudal Kristiansen | Manager, Avanade If you have several branches in your code base, you will need to merge often. This requires manual merging, when a file has been changed in both branches. It offers a good opportunity to actually review to changed code. So my advice is: Merging between branches should be done as often as possible, it should be done by a senior developer, and he/she should perform a full code review of the code being merged. As for detecting architectural smells and code smells creeping into the code base, one really good third party tools exist: Ndepend (http://www.ndepend.com/, for static code analysis of the current state of the code base). You could also consider adding StyleCop to the solution. Jesse Houwing | Visual Studio ALM Ranger I gave a presentation on this subject on the TechDays conference in NL last year. See my presentation and slides here (talk in Dutch, but English presentation): http://blog.jessehouwing.nl/2012/03/did-you-miss-my-techdaysnl-talk-on-code.html  I’d like to add a few more points: - Before/After checking is mostly a trust issue. If you have a team that does diligent peer reviews and regularly talk/sit together or peer review, there’s no need to enforce a before-checkin policy. The peer peer-programming and regular feedback during development can take care of most of the review requirements as long as the team isn’t under stress. 
- Under stress, enforce pre-check-in reviews. It might sound strange if you’re already under time or budgetary constraints, but it is under such conditions that most real issues are created or pile up. - Use tools to catch the most common errors; Code Analysis/FxCop was already mentioned. HP Fortify, ReSharper, CodeRush etc. can help you there. There are also a lot of 3rd-party rules you can add to Code Analysis. I’ve written a few myself (http://fccopcontrib.codeplex.com) and various teams from Microsoft have added their own rules (MSOCAF for SharePoint, WSSF for WCF). For common errors that keep cropping up, see if you can define a rule. It’s much easier. But more importantly, make sure you have a good help page explaining *WHY* it's wrong. If you have small feature or developer branches/shelvesets, you might want to review pre-merge. It’s still better to do peer reviews and peer programming, but the most important thing is that bad-quality code doesn’t make it into the important branch. So my philosophy: - Use tooling as much as possible. - Make sure the team understands the tooling and the importance of the things it flags. It’s too easy to just click “suppress all” to ignore the warnings. - Under stress, tighten the process; it’s under stress that the problems of late reviews will really surface. - Most importantly, if you do reviews, do them as early as possible, but never later than needed. In other words, pre-check-in/post-check-in doesn’t really matter, as long as the review is done before the code is released. It will just be much more expensive to fix any review outcomes the later you find them. --- I would love to hear what you think!
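The “missing dispose” rule that comes up repeatedly above refers to Code Analysis rule CA2000 (“Dispose objects before losing scope”). A small illustrative C# snippet (mine, not taken from any contributor) of the kind of code it typically flags, and the fix, might look like this:

using System.IO;

public class ReportWriter
{
    // Likely flagged by Code Analysis (CA2000): if WriteLine throws, the
    // StreamWriter is never disposed and the underlying file handle leaks.
    public void WriteReportLeaky(string path, string line)
    {
        StreamWriter writer = new StreamWriter(path);
        writer.WriteLine(line);
        writer.Close();
    }

    // The fix: a using block guarantees Dispose() even on the exception path,
    // which is why promoting this rule from warning to error pays off.
    public void WriteReport(string path, string line)
    {
        using (StreamWriter writer = new StreamWriter(path))
        {
            writer.WriteLine(line);
        }
    }
}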

    Read the article

  • Building applications with WCF - Intro

    - by skjagini
    I am going to write a series of articles using Windows Communication Foundation (WCF) to develop client and server applications, and this is the first part of that series. What is WCF As Juval puts it in his Programming WCF book, WCF provides an SDK for developing and deploying services on Windows, and provides a runtime environment to expose CLR types as services and consume services as CLR types. Building services with WCF is incredibly easy, and its implementation provides a set of industry standards and off-the-shelf plumbing including service hosting, instance management, reliability, transaction management, security etc., such that it greatly increases productivity. Scenario: Let's consider a typical bank customer trying to create an account, deposit an amount and transfer funds between accounts, i.e. checking and savings. To make it interesting, we are going to divide the functionality into multiple services, each of them working with the database directly. We will run test cases with and without transactional support across services. In this post we will build contracts, services, the data access layer, and unit tests to verify end-to-end communication, nothing big here, and we will dig into other features of WCF in subsequent posts with incremental changes. In any distributed architecture we have two pieces, i.e. services and clients. Services, as the name implies, provide functionality to execute various pieces of business logic on the server, while clients provide interaction to the end user. Services can be built with Web Services or with WCF. Services built on WCF have the advantage of being binding independent, i.e. they can run against TCP and HTTP protocols without any significant changes to the code. Solution Services Profile: For creating a new bank customer, getting details about an existing customer ProfileContract ProfileService Checking Account: To get the checking account balance, deposit or withdraw an amount CheckingAccountContract CheckingAccountService Savings Account: To get the savings account balance, deposit or withdraw an amount SavingsAccountContract SavingsAccountService ServiceHost: To host services, i.e. running the services at a particular address, binding and contract where the client can connect to Client: Helps the end user use services like creating an account and transferring amounts between the accounts BankDAL: Data access layer to work with the database     BankDAL It's a no-brainer to use an ORM, as many mature products are currently available in the market including Linq2Sql, Entity Framework (EF), LLBLGen Pro etc. For this exercise I am going to use Entity Framework 4.0, CTP 5, with the code-first approach. There are two approaches when working with data: data driven and code driven. In data driven we start by designing tables and their constraints in the database and generate entities in code, while in the code-driven (code-first) approach entities are defined in code and the metadata generated from the entities is used by EF to create the tables and table constraints. In previous versions the entity classes had to derive from EF-specific base classes. In EF 4 it is not required to derive from any EF classes; the entities are not only persistence ignorant but also enable full test-driven development using mock frameworks.  The application consists of 3 entities: a Customer entity which contains customer details, and CheckingAccount and SavingsAccount to hold the respective account balances. 
We could have introduced an Account base class for CheckingAccount and SavingsAccount which is certainly possible with EF mappings but to keep it simple we are just going to follow 1 –1 mapping between entity and table mappings. Lets start out by defining a class called Customer which will be mapped to Customer table, observe that the class is simply a plain old clr object (POCO) and has no reference to EF at all. using System;   namespace BankDAL.Model { public class Customer { public int Id { get; set; } public string FullName { get; set; } public string Address { get; set; } public DateTime DateOfBirth { get; set; } } }   In order to inform EF about the Customer entity we have to define a database context with properties of type DbSet<> for every POCO which needs to be mapped to a table in database. EF uses convention over configuration to generate the metadata resulting in much less configuration. using System.Data.Entity;   namespace BankDAL.Model { public class BankDbContext: DbContext { public DbSet<Customer> Customers { get; set; } } }   Entity constrains can be defined through attributes on Customer class or using fluent syntax (no need to muscle with xml files), CustomerConfiguration class. By defining constrains in a separate class we can maintain clean POCOs without corrupting entity classes with database specific information.   using System; using System.Data.Entity.ModelConfiguration;   namespace BankDAL.Model { public class CustomerConfiguration: EntityTypeConfiguration<Customer> { public CustomerConfiguration() { Initialize(); }   private void Initialize() { //Setting the Primary Key this.HasKey(e => e.Id);   //Setting required fields this.HasRequired(e => e.FullName); this.HasRequired(e => e.Address); //Todo: Can't create required constraint as DateOfBirth is not reference type, research it //this.HasRequired(e => e.DateOfBirth); } } }   Any queries executed against Customers property in BankDbContext are executed against Cusomers table. By convention EF looks for connection string with key of BankDbContext when working with the context.   We are going to define a helper class to work with Customer entity with methods for querying, adding new entity etc and these are known as repository classes, i.e., CustomerRepository   using System; using System.Data.Entity; using System.Linq; using BankDAL.Model;   namespace BankDAL.Repositories { public class CustomerRepository { private readonly IDbSet<Customer> _customers;   public CustomerRepository(BankDbContext bankDbContext) { if (bankDbContext == null) throw new ArgumentNullException(); _customers = bankDbContext.Customers; }   public IQueryable<Customer> Query() { return _customers; }   public void Add(Customer customer) { _customers.Add(customer); } } }   From the above code it is observable that the Query methods returns customers as IQueryable i.e. customers are retrieved only when actually used i.e. iterated. Returning as IQueryable also allows to execute filtering and joining statements from business logic using lamba expressions without cluttering the data access layer with tens of methods.   
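To illustrate that last point, the business layer can compose its own filters on top of Query() without adding new repository methods. A small illustrative snippet using the classes above (the "NJ" filter is just an example value):

// Illustration only: filter and sort in the calling code via IQueryable,
// so the repository does not need a dedicated method per query.
// (Requires a using directive for System.Linq.)
using (var bankDbContext = new BankDbContext())
{
    var customersInNewJersey = new CustomerRepository(bankDbContext)
        .Query()
        .Where(c => c.Address == "NJ")
        .OrderBy(c => c.FullName)
        .ToList();
}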
Our CheckingAccountRepository and SavingsAccountRepository look very similar to each other using System; using System.Data.Entity; using System.Linq; using BankDAL.Model;   namespace BankDAL.Repositories { public class CheckingAccountRepository { private readonly IDbSet<CheckingAccount> _checkingAccounts;   public CheckingAccountRepository(BankDbContext bankDbContext) { if (bankDbContext == null) throw new ArgumentNullException(); _checkingAccounts = bankDbContext.CheckingAccounts; }   public IQueryable<CheckingAccount> Query() { return _checkingAccounts; }   public void Add(CheckingAccount account) { _checkingAccounts.Add(account); }   public IQueryable<CheckingAccount> GetAccount(int customerId) { return (from act in _checkingAccounts where act.CustomerId == customerId select act); }   } } The repository classes look very similar to each other for Query and Add methods, with the help of C# generics and implementing repository pattern (Martin Fowler) we can reduce the repeated code. Jarod from ElegantCode has posted an article on how to use repository pattern with EF which we will implement in the subsequent articles along with WCF Unity life time managers by Drew Contracts It is very easy to follow contract first approach with WCF, define the interface and append ServiceContract, OperationContract attributes. IProfile contract exposes functionality for creating customer and getting customer details.   using System; using System.ServiceModel; using BankDAL.Model;   namespace ProfileContract { [ServiceContract] public interface IProfile { [OperationContract] Customer CreateCustomer(string customerName, string address, DateTime dateOfBirth);   [OperationContract] Customer GetCustomer(int id);   } }   ICheckingAccount contract exposes functionality for working with checking account, i.e., getting balance, deposit and withdraw of amount. ISavingsAccount contract looks the same as checking account.   using System.ServiceModel;   namespace CheckingAccountContract { [ServiceContract] public interface ICheckingAccount { [OperationContract] decimal? GetCheckingAccountBalance(int customerId);   [OperationContract] void DepositAmount(int customerId,decimal amount);   [OperationContract] void WithdrawAmount(int customerId, decimal amount);   } }   Services   Having covered the data access layer and contracts so far and here comes the core of the business logic, i.e. services.   
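Before diving into the individual services, here is a minimal sketch (my illustration, not the article's final implementation) of the generic repository idea mentioned above, which would fold the duplicated Query/Add members into a single base class:

using System;
using System.Data.Entity;
using System.Linq;
using BankDAL.Model;

namespace BankDAL.Repositories
{
    // Generic repository sketch: Query and Add are written once and shared by
    // CustomerRepository, CheckingAccountRepository and SavingsAccountRepository.
    public class Repository<T> where T : class
    {
        private readonly IDbSet<T> _set;

        public Repository(BankDbContext bankDbContext)
        {
            if (bankDbContext == null) throw new ArgumentNullException("bankDbContext");
            _set = bankDbContext.Set<T>();
        }

        public IQueryable<T> Query()
        {
            return _set;
        }

        public void Add(T entity)
        {
            _set.Add(entity);
        }
    }
}

A concrete repository such as CheckingAccountRepository would then only need to add its account-specific queries, e.g. GetAccount(customerId).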
ProfileService implements the IProfile contract for creating customer and getting customer detail using CustomerRepository. 
using System; using System.Linq; using System.ServiceModel; using BankDAL; using BankDAL.Model; using BankDAL.Repositories; using ProfileContract;   namespace ProfileService { [ServiceBehavior(IncludeExceptionDetailInFaults = true)] public class Profile: IProfile { public Customer CreateAccount( string customerName, string address, DateTime dateOfBirth) { Customer cust = new Customer { FullName = customerName, Address = address, DateOfBirth = dateOfBirth };   using (var bankDbContext = new BankDbContext()) { new CustomerRepository(bankDbContext).Add(cust); bankDbContext.SaveChanges(); } return cust; }   public Customer CreateCustomer(string customerName, string address, DateTime dateOfBirth) { return CreateAccount(customerName, address, dateOfBirth); } public Customer GetCustomer(int id) { return new CustomerRepository(new BankDbContext()).Query() .Where(i => i.Id == id).FirstOrDefault(); }   } } From the above code you shall observe that we are calling bankDBContext’s SaveChanges method and there is no save method specific to customer entity because EF manages all the changes centralized at the context level and all the pending changes so far are submitted in a batch and it is represented as Unit of Work. Similarly Checking service implements ICheckingAccount contract using CheckingAccountRepository, notice that we are throwing overdraft exception if the balance falls by zero. WCF has it’s own way of raising exceptions using fault contracts which will be explained in the subsequent articles. SavingsAccountService is similar to CheckingAccountService. using System; using System.Linq; using System.ServiceModel; using BankDAL.Model; using BankDAL.Repositories; using CheckingAccountContract;   namespace CheckingAccountService { [ServiceBehavior(IncludeExceptionDetailInFaults = true)] public class Checking:ICheckingAccount { public decimal? GetCheckingAccountBalance(int customerId) { using (var bankDbContext = new BankDbContext()) { CheckingAccount account = (new CheckingAccountRepository(bankDbContext) .GetAccount(customerId)).FirstOrDefault();   if (account != null) return account.Balance;   return null; } }   public void DepositAmount(int customerId, decimal amount) { using(var bankDbContext = new BankDbContext()) { var checkingAccountRepository = new CheckingAccountRepository(bankDbContext); CheckingAccount account = (checkingAccountRepository.GetAccount(customerId)) .FirstOrDefault();   if (account == null) { account = new CheckingAccount() { CustomerId = customerId }; checkingAccountRepository.Add(account); }   account.Balance = account.Balance + amount; if (account.Balance < 0) throw new ApplicationException("Overdraft not accepted");   bankDbContext.SaveChanges(); } } public void WithdrawAmount(int customerId, decimal amount) { DepositAmount(customerId, -1*amount); } } }   BankServiceHost The host acts as a glue binding contracts with it’s services, exposing the endpoints. The services can be exposed either through the code or configuration file, configuration file is preferred as it allows run time changes to service behavior even after deployment. We have 3 services and for each of the service you need to define name (the class that implements the service with fully qualified namespace) and endpoint known as ABC, i.e. address, binding and contract. 
We are using netTcpBinding and have defined the base address for each of the contracts: <system.serviceModel> <services> <service name="ProfileService.Profile"> <endpoint binding="netTcpBinding" contract="ProfileContract.IProfile"/> <host> <baseAddresses> <add baseAddress="net.tcp://localhost:1000/Profile"/> </baseAddresses> </host> </service> <service name="CheckingAccountService.Checking"> <endpoint binding="netTcpBinding" contract="CheckingAccountContract.ICheckingAccount"/> <host> <baseAddresses> <add baseAddress="net.tcp://localhost:1000/Checking"/> </baseAddresses> </host> </service> <service name="SavingsAccountService.Savings"> <endpoint binding="netTcpBinding" contract="SavingsAccountContract.ISavingsAccount"/> <host> <baseAddresses> <add baseAddress="net.tcp://localhost:1000/Savings"/> </baseAddresses> </host> </service> </services> </system.serviceModel> We have to open the services by creating a service host which will handle the incoming requests from clients.   using System;   namespace ServiceHost { class Program { static void Main(string[] args) { CreateHosts(); Console.ReadLine(); }   private static void CreateHosts() { CreateHost(typeof(ProfileService.Profile),"Profile Service"); CreateHost(typeof(SavingsAccountService.Savings), "Savings Account Service"); CreateHost(typeof(CheckingAccountService.Checking), "Checking Account Service"); }   private static void CreateHost(Type type, string hostDescription) { System.ServiceModel.ServiceHost host = new System.ServiceModel.ServiceHost(type); host.Open();   if (host.ChannelDispatchers != null && host.ChannelDispatchers.Count != 0 && host.ChannelDispatchers[0].Listener != null) Console.WriteLine("Started: " + host.ChannelDispatchers[0].Listener.Uri); else Console.WriteLine("Failed to start:" + hostDescription); } } } BankClient    The client has no knowledge about the service business logic other than the functionality it exposes through the contract, the endpoints and a proxy to work against. The endpoint data and server proxy can be generated by right-clicking on the project reference, choosing 'Add Service Reference' and entering the service endpoint address. Or, if you have access to the source, you can manually reference the contract dlls and update the client's configuration file to point to the service endpoint, provided the server and client both happen to be built using the .Net framework. One of the pros with the manual approach is you don't have to work against messy code-generated files.   
<system.serviceModel> <client> <endpoint name="tcpProfile" address="net.tcp://localhost:1000/Profile" binding="netTcpBinding" contract="ProfileContract.IProfile"/> <endpoint name="tcpCheckingAccount" address="net.tcp://localhost:1000/Checking" binding="netTcpBinding" contract="CheckingAccountContract.ICheckingAccount"/> <endpoint name="tcpSavingsAccount" address="net.tcp://localhost:1000/Savings" binding="netTcpBinding" contract="SavingsAccountContract.ISavingsAccount"/>   </client> </system.serviceModel> The client uses a façade to connect to the services   using System.ServiceModel; using CheckingAccountContract; using ProfileContract; using SavingsAccountContract;   namespace Client { public class ProxyFacade { public static IProfile ProfileProxy() { return (new ChannelFactory<IProfile>("tcpProfile")).CreateChannel(); }   public static ICheckingAccount CheckingAccountProxy() { return (new ChannelFactory<ICheckingAccount>("tcpCheckingAccount")) .CreateChannel(); }   public static ISavingsAccount SavingsAccountProxy() { return (new ChannelFactory<ISavingsAccount>("tcpSavingsAccount")) .CreateChannel(); }   } }   With that in place, lets get our unit tests going   using System; using System.Diagnostics; using BankDAL.Model; using NUnit.Framework; using ProfileContract;   namespace Client { [TestFixture] public class Tests { private void TransferFundsFromSavingsToCheckingAccount(int customerId, decimal amount) { ProxyFacade.CheckingAccountProxy().DepositAmount(customerId, amount); ProxyFacade.SavingsAccountProxy().WithdrawAmount(customerId, amount); }   private void TransferFundsFromCheckingToSavingsAccount(int customerId, decimal amount) { ProxyFacade.SavingsAccountProxy().DepositAmount(customerId, amount); ProxyFacade.CheckingAccountProxy().WithdrawAmount(customerId, amount); }     [Test] public void CreateAndGetProfileTest() { IProfile profile = ProxyFacade.ProfileProxy(); const string customerName = "Tom"; int customerId = profile.CreateCustomer(customerName, "NJ", new DateTime(1982, 1, 1)).Id; Customer customer = profile.GetCustomer(customerId); Assert.AreEqual(customerName,customer.FullName); }   [Test] public void DepositWithDrawAndTransferAmountTest() { IProfile profile = ProxyFacade.ProfileProxy(); string customerName = "Smith" + DateTime.Now.ToString("HH:mm:ss"); var customer = profile.CreateCustomer(customerName, "NJ", new DateTime(1982, 1, 1)); // Deposit to Savings ProxyFacade.SavingsAccountProxy().DepositAmount(customer.Id, 100); ProxyFacade.SavingsAccountProxy().DepositAmount(customer.Id, 25); Assert.AreEqual(125, ProxyFacade.SavingsAccountProxy().GetSavingsAccountBalance(customer.Id)); // Withdraw ProxyFacade.SavingsAccountProxy().WithdrawAmount(customer.Id, 30); Assert.AreEqual(95, ProxyFacade.SavingsAccountProxy().GetSavingsAccountBalance(customer.Id));   // Deposit to Checking ProxyFacade.CheckingAccountProxy().DepositAmount(customer.Id, 60); ProxyFacade.CheckingAccountProxy().DepositAmount(customer.Id, 40); Assert.AreEqual(100, ProxyFacade.CheckingAccountProxy().GetCheckingAccountBalance(customer.Id)); // Withdraw ProxyFacade.CheckingAccountProxy().WithdrawAmount(customer.Id, 30); Assert.AreEqual(70, ProxyFacade.CheckingAccountProxy().GetCheckingAccountBalance(customer.Id));   // Transfer from Savings to Checking TransferFundsFromSavingsToCheckingAccount(customer.Id,10); Assert.AreEqual(85, ProxyFacade.SavingsAccountProxy().GetSavingsAccountBalance(customer.Id)); Assert.AreEqual(80, ProxyFacade.CheckingAccountProxy().GetCheckingAccountBalance(customer.Id));   // Transfer 
from Checking to Savings TransferFundsFromCheckingToSavingsAccount(customer.Id, 50); Assert.AreEqual(135, ProxyFacade.SavingsAccountProxy().GetSavingsAccountBalance(customer.Id)); Assert.AreEqual(30, ProxyFacade.CheckingAccountProxy().GetCheckingAccountBalance(customer.Id)); }   [Test] public void FundTransfersWithOverDraftTest() { IProfile profile = ProxyFacade.ProfileProxy(); string customerName = "Angelina" + DateTime.Now.ToString("HH:mm:ss");   var customerId = profile.CreateCustomer(customerName, "NJ", new DateTime(1972, 1, 1)).Id;   ProxyFacade.SavingsAccountProxy().DepositAmount(customerId, 100); TransferFundsFromSavingsToCheckingAccount(customerId,80); Assert.AreEqual(20, ProxyFacade.SavingsAccountProxy().GetSavingsAccountBalance(customerId)); Assert.AreEqual(80, ProxyFacade.CheckingAccountProxy().GetCheckingAccountBalance(customerId));   try { TransferFundsFromSavingsToCheckingAccount(customerId,30); } catch (Exception e) { Debug.WriteLine(e.Message); }   Assert.AreEqual(110, ProxyFacade.CheckingAccountProxy().GetCheckingAccountBalance(customerId)); Assert.AreEqual(20, ProxyFacade.SavingsAccountProxy().GetSavingsAccountBalance(customerId)); } } }   We are creating a new instance of the channel for every operation, we will look into instance management and how creating a new instance of channel affects it in subsequent articles. The first two test cases deals with creation of Customer, deposit and withdraw of month between accounts. The last case, FundTransferWithOverDraftTest() is interesting. Customer starts with depositing $100 in SavingsAccount followed by transfer of $80 in to checking account resulting in $20 in savings account.  Customer then initiates $30 transfer from Savings to Checking resulting in overdraft exception on Savings with $30 being deposited to Checking. As we are not running both the requests in transactions the customer ends up with more amount than what he started with $100. In subsequent posts we will look into transactions handling.  Make sure the ServiceHost project is set as start up project and start the solution. Run the test cases either from NUnit client or TestDriven.Net/Resharper which ever is your favorite tool. Make sure you have updated the data base connection string in the ServiceHost config file to point to your local database
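For reference, the connection string entry in the ServiceHost project's App.config could look like the snippet below; by convention EF code-first picks it up because the entry name matches the BankDbContext class. The server and database names are placeholders for your local setup:

<connectionStrings>
  <add name="BankDbContext"
       connectionString="Data Source=.;Initial Catalog=Bank;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>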

    Read the article

  • Calculating the Size (in Bytes and MB) of an Oracle Coherence Cache

    - by Ricardo Ferreira
    The concept and usage of data grids are becoming very popular in this days since this type of technology are evolving very fast with some cool lead products like Oracle Coherence. Once for a while, developers need an programmatic way to calculate the total size of a specific cache that are residing in the data grid. In this post, I will show how to accomplish this using Oracle Coherence API. This example has been tested with 3.6, 3.7 and 3.7.1 versions of Oracle Coherence. To start the development of this example, you need to create a POJO ("Plain Old Java Object") that represents a data structure that will hold user data. This data structure will also create an internal fat so I call that should increase considerably the size of each instance in the heap memory. Create a Java class named "Person" as shown in the listing below. package com.oracle.coherence.domain; import java.io.Serializable; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Random; @SuppressWarnings("serial") public class Person implements Serializable { private String firstName; private String lastName; private List<Object> fat; private String email; public Person() { generateFat(); } public Person(String firstName, String lastName, String email) { setFirstName(firstName); setLastName(lastName); setEmail(email); generateFat(); } private void generateFat() { fat = new ArrayList<Object>(); Random random = new Random(); for (int i = 0; i < random.nextInt(18000); i++) { HashMap<Long, Double> internalFat = new HashMap<Long, Double>(); for (int j = 0; j < random.nextInt(10000); j++) { internalFat.put(random.nextLong(), random.nextDouble()); } fat.add(internalFat); } } public String getFirstName() { return firstName; } public void setFirstName(String firstName) { this.firstName = firstName; } public String getLastName() { return lastName; } public void setLastName(String lastName) { this.lastName = lastName; } public String getEmail() { return email; } public void setEmail(String email) { this.email = email; } } Now let's create a Java program that will start a data grid into Coherence and will create a cache named "People", that will hold people instances with sequential integer keys. Each person created in this program will trigger the execution of a custom constructor created in the People class that instantiates an internal fat (the random amount of data generated to increase the size of the object) for each person. Create a Java class named "CreatePeopleCacheAndPopulateWithData" as shown in the listing below. package com.oracle.coherence.demo; import com.oracle.coherence.domain.Person; import com.tangosol.net.CacheFactory; import com.tangosol.net.NamedCache; public class CreatePeopleCacheAndPopulateWithData { public static void main(String[] args) { // Asks Coherence for a new cache named "People"... NamedCache people = CacheFactory.getCache("People"); // Creates three people that will be putted into the data grid. Each person // generates an internal fat that should increase its size in terms of bytes... Person pessoa1 = new Person("Ricardo", "Ferreira", "[email protected]"); Person pessoa2 = new Person("Vitor", "Ferreira", "[email protected]"); Person pessoa3 = new Person("Vivian", "Ferreira", "[email protected]"); // Insert three people at the data grid... people.put(1, pessoa1); people.put(2, pessoa2); people.put(3, pessoa3); // Waits for 5 minutes until the user runs the Java program // that calculates the total size of the people cache... 
try { System.out.println("---> Waiting for 5 minutes for the cache size calculation..."); Thread.sleep(300000); } catch (InterruptedException ie) { ie.printStackTrace(); } } } Finally, let's create a Java program that, using the Coherence API and JMX, will calculate the total size of each cache that the data grid is currently managing. The approach used in this example was retrieve every cache that the data grid are currently managing, but if you are interested on an specific cache, the same approach can be used, you should only filter witch cache will be looked for. Create a Java class named "CalculateTheSizeOfPeopleCache" as shown in the listing below. package com.oracle.coherence.demo; import java.text.DecimalFormat; import java.util.Map; import java.util.Set; import java.util.TreeMap; import javax.management.MBeanServer; import javax.management.MBeanServerFactory; import javax.management.ObjectName; import com.tangosol.net.CacheFactory; public class CalculateTheSizeOfPeopleCache { @SuppressWarnings({ "unchecked", "rawtypes" }) private void run() throws Exception { // Enable JMX support in this Coherence data grid session... System.setProperty("tangosol.coherence.management", "all"); // Create a sample cache just to access the data grid... CacheFactory.getCache(MBeanServerFactory.class.getName()); // Gets the JMX server from Coherence data grid... MBeanServer jmxServer = getJMXServer(); // Creates a internal data structure that would maintain // the statistics from each cache in the data grid... Map cacheList = new TreeMap(); Set jmxObjectList = jmxServer.queryNames(new ObjectName("Coherence:type=Cache,*"), null); for (Object jmxObject : jmxObjectList) { ObjectName jmxObjectName = (ObjectName) jmxObject; String cacheName = jmxObjectName.getKeyProperty("name"); if (cacheName.equals(MBeanServerFactory.class.getName())) { continue; } else { cacheList.put(cacheName, new Statistics(cacheName)); } } // Updates the internal data structure with statistic data // retrieved from caches inside the in-memory data grid... Set<String> cacheNames = cacheList.keySet(); for (String cacheName : cacheNames) { Set resultSet = jmxServer.queryNames( new ObjectName("Coherence:type=Cache,name=" + cacheName + ",*"), null); for (Object resultSetRef : resultSet) { ObjectName objectName = (ObjectName) resultSetRef; if (objectName.getKeyProperty("tier").equals("back")) { int unit = (Integer) jmxServer.getAttribute(objectName, "Units"); int size = (Integer) jmxServer.getAttribute(objectName, "Size"); Statistics statistics = (Statistics) cacheList.get(cacheName); statistics.incrementUnit(unit); statistics.incrementSize(size); cacheList.put(cacheName, statistics); } } } // Finally... print the objects from the internal data // structure that represents the statistics from caches... 
cacheNames = cacheList.keySet(); for (String cacheName : cacheNames) { Statistics estatisticas = (Statistics) cacheList.get(cacheName); System.out.println(estatisticas); } } public MBeanServer getJMXServer() { MBeanServer jmxServer = null; for (Object jmxServerRef : MBeanServerFactory.findMBeanServer(null)) { jmxServer = (MBeanServer) jmxServerRef; if (jmxServer.getDefaultDomain().equals(DEFAULT_DOMAIN) || DEFAULT_DOMAIN.length() == 0) { break; } jmxServer = null; } if (jmxServer == null) { jmxServer = MBeanServerFactory.createMBeanServer(DEFAULT_DOMAIN); } return jmxServer; } private class Statistics { private long unit; private long size; private String cacheName; public Statistics(String cacheName) { this.cacheName = cacheName; } public void incrementUnit(long unit) { this.unit += unit; } public void incrementSize(long size) { this.size += size; } public long getUnit() { return unit; } public long getSize() { return size; } public double getUnitInMB() { return unit / (1024.0 * 1024.0); } public double getAverageSize() { return size == 0 ? 0 : unit / size; } public String toString() { StringBuffer sb = new StringBuffer(); sb.append("\nCache Statistics of '").append(cacheName).append("':\n"); sb.append(" - Total Entries of Cache -----> " + getSize()).append("\n"); sb.append(" - Used Memory (Bytes) --------> " + getUnit()).append("\n"); sb.append(" - Used Memory (MB) -----------> " + FORMAT.format(getUnitInMB())).append("\n"); sb.append(" - Object Average Size --------> " + FORMAT.format(getAverageSize())).append("\n"); return sb.toString(); } } public static void main(String[] args) throws Exception { new CalculateTheSizeOfPeopleCache().run(); } public static final DecimalFormat FORMAT = new DecimalFormat("###.###"); public static final String DEFAULT_DOMAIN = ""; public static final String DOMAIN_NAME = "Coherence"; } I've commented the overall example so, I don't think that you should get into trouble to understand it. Basically we are dealing with JMX. The first thing to do is enable JMX support for the Coherence client (ie, an JVM that will only retrieve values from the data grid and will not integrate the cluster) application. This can be done very easily using the runtime "tangosol.coherence.management" system property. Consult the Coherence documentation for JMX to understand the possible values that could be applied. The program creates an in memory data structure that holds a custom class created called "Statistics". This class represents the information that we are interested to see, which in this case are the size in bytes and in MB of the caches. An instance of this class is created for each cache that are currently managed by the data grid. Using JMX specific methods, we retrieve the information that are relevant for calculate the total size of the caches. To test this example, you should execute first the CreatePeopleCacheAndPopulateWithData.java program and after the CreatePeopleCacheAndPopulateWithData.java program. 
The results in the console should be something like this: 2012-06-23 13:29:31.188/4.970 Oracle Coherence 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded operational configuration from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/tangosol-coherence.xml" 2012-06-23 13:29:31.219/5.001 Oracle Coherence 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded operational overrides from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/tangosol-coherence-override-dev.xml" 2012-06-23 13:29:31.219/5.001 Oracle Coherence 3.6.0.4 <D5> (thread=Main Thread, member=n/a): Optional configuration override "/tangosol-coherence-override.xml" is not specified 2012-06-23 13:29:31.266/5.048 Oracle Coherence 3.6.0.4 <D5> (thread=Main Thread, member=n/a): Optional configuration override "/custom-mbeans.xml" is not specified Oracle Coherence Version 3.6.0.4 Build 19111 Grid Edition: Development mode Copyright (c) 2000, 2010, Oracle and/or its affiliates. All rights reserved. 2012-06-23 13:29:33.156/6.938 Oracle Coherence GE 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded Reporter configuration from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/reports/report-group.xml" 2012-06-23 13:29:33.500/7.282 Oracle Coherence GE 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Loaded cache configuration from "jar:file:/E:/Oracle/Middleware/oepe_11gR1PS4/workspace/calcular-tamanho-cache-coherence/lib/coherence.jar!/coherence-cache-config.xml" 2012-06-23 13:29:35.391/9.173 Oracle Coherence GE 3.6.0.4 <D4> (thread=Main Thread, member=n/a): TCMP bound to /192.168.177.133:8090 using SystemSocketProvider 2012-06-23 13:29:37.062/10.844 Oracle Coherence GE 3.6.0.4 <Info> (thread=Cluster, member=n/a): This Member(Id=2, Timestamp=2012-06-23 13:29:36.899, Address=192.168.177.133:8090, MachineId=55685, Location=process:244, Role=Oracle, Edition=Grid Edition, Mode=Development, CpuCount=2, SocketCount=2) joined cluster "cluster:0xC4DB" with senior Member(Id=1, Timestamp=2012-06-23 13:29:14.031, Address=192.168.177.133:8088, MachineId=55685, Location=process:1128, Role=CreatePeopleCacheAndPopulateWith, Edition=Grid Edition, Mode=Development, CpuCount=2, SocketCount=2) 2012-06-23 13:29:37.172/10.954 Oracle Coherence GE 3.6.0.4 <D5> (thread=Cluster, member=n/a): Member 1 joined Service Cluster with senior member 1 2012-06-23 13:29:37.188/10.970 Oracle Coherence GE 3.6.0.4 <D5> (thread=Cluster, member=n/a): Member 1 joined Service Management with senior member 1 2012-06-23 13:29:37.188/10.970 Oracle Coherence GE 3.6.0.4 <D5> (thread=Cluster, member=n/a): Member 1 joined Service DistributedCache with senior member 1 2012-06-23 13:29:37.188/10.970 Oracle Coherence GE 3.6.0.4 <Info> (thread=Main Thread, member=n/a): Started cluster Name=cluster:0xC4DB Group{Address=224.3.6.0, Port=36000, TTL=4} MasterMemberSet ( ThisMember=Member(Id=2, Timestamp=2012-06-23 13:29:36.899, Address=192.168.177.133:8090, MachineId=55685, Location=process:244, Role=Oracle) OldestMember=Member(Id=1, Timestamp=2012-06-23 13:29:14.031, Address=192.168.177.133:8088, MachineId=55685, Location=process:1128, Role=CreatePeopleCacheAndPopulateWith) ActualMemberSet=MemberSet(Size=2, BitSetCount=2 Member(Id=1, Timestamp=2012-06-23 13:29:14.031, Address=192.168.177.133:8088, MachineId=55685, Location=process:1128, Role=CreatePeopleCacheAndPopulateWith) Member(Id=2, 
Timestamp=2012-06-23 13:29:36.899, Address=192.168.177.133:8090, MachineId=55685, Location=process:244, Role=Oracle) ) RecycleMillis=1200000 RecycleSet=MemberSet(Size=0, BitSetCount=0 ) ) TcpRing{Connections=[1]} IpMonitor{AddressListSize=0} 2012-06-23 13:29:37.891/11.673 Oracle Coherence GE 3.6.0.4 <D5> (thread=Invocation:Management, member=2): Service Management joined the cluster with senior service member 1 2012-06-23 13:29:39.203/12.985 Oracle Coherence GE 3.6.0.4 <D5> (thread=DistributedCache, member=2): Service DistributedCache joined the cluster with senior service member 1 2012-06-23 13:29:39.297/13.079 Oracle Coherence GE 3.6.0.4 <D4> (thread=DistributedCache, member=2): Asking member 1 for 128 primary partitions Cache Statistics of 'People': - Total Entries of Cache -----> 3 - Used Memory (Bytes) --------> 883920 - Used Memory (MB) -----------> 0.843 - Object Average Size --------> 294640 I hope that this post could save you some time when calculate the total size of Coherence cache became a requirement for your high scalable system using data grids. See you!

    Read the article

  • Creating ASP.NET MVC Negotiated Content Results

    - by Rick Strahl
    In a recent ASP.NET MVC application I’m involved with, we had a late in the process request to handle Content Negotiation: Returning output based on the HTTP Accept header of the incoming HTTP request. This is standard behavior in ASP.NET Web API but ASP.NET MVC doesn’t support this functionality directly out of the box. Another reason this came up in discussion is last week’s announcements of ASP.NET vNext, which seems to indicate that ASP.NET Web API is not going to be ported to the cloud version of vNext, but rather be replaced by a combined version of MVC and Web API. While it’s not clear what new API features will show up in this new framework, it’s pretty clear that the ASP.NET MVC style syntax will be the new standard for all the new combined HTTP processing framework. Why negotiated Content? Content negotiation is one of the key features of Web API even though it’s such a relatively simple thing. But it’s also something that’s missing in MVC and once you get used to automatically having your content returned based on Accept headers it’s hard to go back to manually having to create separate methods for different output types as you’ve had to with Microsoft server technologies all along (yes, yes I know other frameworks – including my own – have done this for years but for in the box features this is relatively new from Web API). As a quick review,  Accept Header content negotiation works off the request’s HTTP Accept header:POST http://localhost/mydailydosha/Editable/NegotiateContent HTTP/1.1 Content-Type: application/json Accept: application/json Host: localhost Content-Length: 76 Pragma: no-cache { ElementId: "header", PageName: "TestPage", Text: "This is a nice header" } If I make this request I would expect to get back a JSON result based on my application/json Accept header. To request XML  I‘d just change the accept header:Accept: text/xml and now I’d expect the response to come back as XML. Now this only works with media types that the server can process. In my case here I need to handle JSON, XML, HTML (using Views) and Plain Text. HTML results might need more than just a data return – you also probably need to specify a View to render the data into either by specifying the view explicitly or by using some sort of convention that can automatically locate a view to match. Today ASP.NET MVC doesn’t support this sort of automatic content switching out of the box. Unfortunately, in my application scenario we have an application that started out primarily with an AJAX backend that was implemented with JSON only. So there are lots of JSON results like this:[Route("Customers")] public ActionResult GetCustomers() { return Json(repo.GetCustomers(),JsonRequestBehavior.AllowGet); } These work fine, but they are of course JSON specific. Then a couple of weeks ago, a requirement came in that an old desktop application needs to also consume this API and it has to use XML to do it because there’s no JSON parser available for it. Ooops – stuck with JSON in this case. While it would have been easy to add XML specific methods I figured it’s easier to add basic content negotiation. And that’s what I show in this post. Missteps – IResultFilter, IActionFilter My first attempt at this was to use IResultFilter or IActionFilter which look like they would be ideal to modify result content after it’s been generated using OnResultExecuted() or OnActionExecuted(). 
Filters are great because they can look globally at all controller methods or individual methods that are marked up with the Filter’s attribute. But it turns out these filters don’t work for raw POCO result values from Action methods. What we wanted to do for API calls is get back to using plain .NET types as results rather than result actions. That is  you write a method that doesn’t return an ActionResult, but a standard .NET type like this:public Customer UpdateCustomer(Customer cust) { … do stuff to customer :-) return cust; } Unfortunately both OnResultExecuted and OnActionExecuted receive an MVC ContentResult instance from the POCO object. MVC basically takes any non-ActionResult return value and turns it into a ContentResult by converting the value using .ToString(). Ugh. The ContentResult itself doesn’t contain the original value, which is lost AFAIK with no way to retrieve it. So there’s no way to access the raw customer object in the example above. Bummer. Creating a NegotiatedResult This leaves mucking around with custom ActionResults. ActionResults are MVC’s standard way to return action method results – you basically specify that you would like to render your result in a specific format. Common ActionResults are ViewResults (ie. View(vn,model)), JsonResult, RedirectResult etc. They work and are fairly effective and work fairly well for testing as well as it’s the ‘standard’ interface to return results from actions. The problem with the this is mainly that you’re explicitly saying that you want a specific result output type. This works well for many things, but sometimes you do want your result to be negotiated. My first crack at this solution here is to create a simple ActionResult subclass that looks at the Accept header and based on that writes the output. I need to support JSON and XML content and HTML as well as text – so effectively 4 media types: application/json, text/xml, text/html and text/plain. Everything else is passed through as ContentResult – which effecively returns whatever .ToString() returns. Here’s what the NegotiatedResult usage looks like:public ActionResult GetCustomers() { return new NegotiatedResult(repo.GetCustomers()); } public ActionResult GetCustomer(int id) { return new NegotiatedResult("Show", repo.GetCustomer(id)); } There are two overloads of this method – one that returns just the raw result value and a second version that accepts an optional view name. The second version returns the Razor view specified only if text/html is requested – otherwise the raw data is returned. This is useful in applications where you have an HTML front end that can also double as an API interface endpoint that’s using the same model data you send to the View. For the application I mentioned above this was another actual use-case we needed to address so this was a welcome side effect of creating a custom ActionResult. There’s also an extension method that directly attaches a Negotiated() method to the controller using the same syntax:public ActionResult GetCustomers() { return this.Negotiated(repo.GetCustomers()); } public ActionResult GetCustomer(int id) { return this.Negotiated("Show",repo.GetCustomer(id)); } Using either of these mechanisms now allows you to return JSON, XML, HTML or plain text results depending on the Accept header sent. Send application/json you get just the Customer JSON data. Ditto for text/xml and XML data. 
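For example, a quick client-side check of those two cases only needs to vary the Accept header. This is my own sketch; the console wrapper, the base address and the /Customers route are taken from the earlier examples and are placeholders that may differ in your project:

using System;
using System.Net.Http;
using System.Net.Http.Headers;

class NegotiationSmokeTest
{
    static void Main()
    {
        using (var client = new HttpClient { BaseAddress = new Uri("http://localhost/mydailydosha/") })
        {
            // Ask the action for JSON...
            client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
            Console.WriteLine(client.GetStringAsync("Customers").Result);

            // ...then ask the same action for XML.
            client.DefaultRequestHeaders.Accept.Clear();
            client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("text/xml"));
            Console.WriteLine(client.GetStringAsync("Customers").Result);
        }
    }
}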
Pass text/html for the Accept header and the "Show.cshtml" Razor view is rendered passing the result model data producing final HTML output. While this isn’t as clean as passing just POCO objects back as I had intended originally, this approach fits better with how MVC action methods are intended to be used and we get the bonus of being able to specify a View to render (optionally) for HTML. How does it work An ActionResult implementation is pretty straightforward. You inherit from ActionResult and implement the ExecuteResult method to send your output to the ASP.NET output stream. ActionFilters are an easy way to effectively do post processing on ASP.NET MVC controller actions just before the content is sent to the output stream, assuming your specific action result was used. Here’s the full code to the NegotiatedResult class (you can also check it out on GitHub):/// <summary> /// Returns a content negotiated result based on the Accept header. /// Minimal implementation that works with JSON and XML content, /// can also optionally return a view with HTML. /// </summary> /// <example> /// // model data only /// public ActionResult GetCustomers() /// { /// return new NegotiatedResult(repo.Customers.OrderBy( c=> c.Company) ) /// } /// // optional view for HTML /// public ActionResult GetCustomers() /// { /// return new NegotiatedResult("List", repo.Customers.OrderBy( c=> c.Company) ) /// } /// </example> public class NegotiatedResult : ActionResult { /// <summary> /// Data stored to be 'serialized'. Public /// so it's potentially accessible in filters. /// </summary> public object Data { get; set; } /// <summary> /// Optional name of the HTML view to be rendered /// for HTML responses /// </summary> public string ViewName { get; set; } public static bool FormatOutput { get; set; } static NegotiatedResult() { FormatOutput = HttpContext.Current.IsDebuggingEnabled; } /// <summary> /// Pass in data to serialize /// </summary> /// <param name="data">Data to serialize</param> public NegotiatedResult(object data) { Data = data; } /// <summary> /// Pass in data and an optional view for HTML views /// </summary> /// <param name="data"></param> /// <param name="viewName"></param> public NegotiatedResult(string viewName, object data) { Data = data; ViewName = viewName; } public override void ExecuteResult(ControllerContext context) { if (context == null) throw new ArgumentNullException("context"); HttpResponseBase response = context.HttpContext.Response; HttpRequestBase request = context.HttpContext.Request; // Look for specific content types if (request.AcceptTypes.Contains("text/html")) { response.ContentType = "text/html"; if (!string.IsNullOrEmpty(ViewName)) { var viewData = context.Controller.ViewData; viewData.Model = Data; var viewResult = new ViewResult { ViewName = ViewName, MasterName = null, ViewData = viewData, TempData = context.Controller.TempData, ViewEngineCollection = ((Controller)context.Controller).ViewEngineCollection }; viewResult.ExecuteResult(context.Controller.ControllerContext); } else response.Write(Data); } else if (request.AcceptTypes.Contains("text/plain")) { response.ContentType = "text/plain"; response.Write(Data); } else if (request.AcceptTypes.Contains("application/json")) { using (JsonTextWriter writer = new JsonTextWriter(response.Output)) { var settings = new JsonSerializerSettings(); if (FormatOutput) settings.Formatting = Newtonsoft.Json.Formatting.Indented; JsonSerializer serializer = JsonSerializer.Create(settings); serializer.Serialize(writer, Data); 
writer.Flush(); } } else if (request.AcceptTypes.Contains("text/xml")) { response.ContentType = "text/xml"; if (Data != null) { using (var writer = new XmlTextWriter(response.OutputStream, new UTF8Encoding())) { if (FormatOutput) writer.Formatting = System.Xml.Formatting.Indented; XmlSerializer serializer = new XmlSerializer(Data.GetType()); serializer.Serialize(writer, Data); writer.Flush(); } } } else { // just write data as a plain string response.Write(Data); } } } /// <summary> /// Extends Controller with Negotiated() ActionResult that does /// basic content negotiation based on the Accept header. /// </summary> public static class NegotiatedResultExtensions { /// <summary> /// Return content-negotiated content of the data based on Accept header. /// Supports: /// application/json - using JSON.NET /// text/xml - Xml as XmlSerializer XML /// text/html - as text, or an optional View /// text/plain - as text /// </summary> /// <param name="controller"></param> /// <param name="data">Data to return</param> /// <returns>serialized data</returns> /// <example> /// public ActionResult GetCustomers() /// { /// return this.Negotiated( repo.Customers.OrderBy( c=> c.Company) ) /// } /// </example> public static NegotiatedResult Negotiated(this Controller controller, object data) { return new NegotiatedResult(data); } /// <summary> /// Return content-negotiated content of the data based on Accept header. /// Supports: /// application/json - using JSON.NET /// text/xml - Xml as XmlSerializer XML /// text/html - as text, or an optional View /// text/plain - as text /// </summary> /// <param name="controller"></param> /// <param name="viewName">Name of the View to when Accept is text/html</param> /// /// <param name="data">Data to return</param> /// <returns>serialized data</returns> /// <example> /// public ActionResult GetCustomers() /// { /// return this.Negotiated("List", repo.Customers.OrderBy( c=> c.Company) ) /// } /// </example> public static NegotiatedResult Negotiated(this Controller controller, string viewName, object data) { return new NegotiatedResult(viewName, data); } } Output Generation – JSON and XML Generating output for XML and JSON is simple – you use the desired serializer and off you go. Using XmlSerializer and JSON.NET it’s just a handful of lines each to generate serialized output directly into the HTTP output stream. Please note this implementation uses JSON.NET for its JSON generation rather than the default JavaScriptSerializer that MVC uses which I feel is an additional bonus to implementing this custom action. I’d already been using a custom JsonNetResult class previously, but now this is just rolled into this custom ActionResult. Just keep in mind that JSON.NET outputs slightly different JSON for certain things like collections for example, so behavior may change. One addition to this implementation might be a flag to allow switching the JSON serializer. Html View Generation Html View generation actually turned out to be easier than anticipated. Initially I used my generic ASP.NET ViewRenderer Class that can render MVC views from any ASP.NET application. However it turns out since we are executing inside of an active MVC request there’s an easier way: We can simply create a custom ViewResult and populate its members and then execute it. 
The text/html handling code that renders the view is simply this: response.ContentType = "text/html"; if (!string.IsNullOrEmpty(ViewName)) { var viewData = context.Controller.ViewData; viewData.Model = Data; var viewResult = new ViewResult { ViewName = ViewName, MasterName = null, ViewData = viewData, TempData = context.Controller.TempData, ViewEngineCollection = ((Controller)context.Controller).ViewEngineCollection }; viewResult.ExecuteResult(context.Controller.ControllerContext); } else response.Write(Data); which is a neat and easy way to render a Razor view assuming you have an active controller that's ready for rendering. Sweet – dependency removed, which makes this class self-contained without any external dependencies other than JSON.NET. Summary While this isn’t exactly a new topic, it’s the first time I’ve actually delved into this with MVC. I’ve been doing content negotiation with Web API and prior to that with my REST library. This is the first time it’s come up as an issue in MVC. But as I have worked through this I find that having a way to specify both HTML Views *and* JSON and XML results from a single controller certainly is appealing to me in many situations, as we are in this particular application returning identical data models for each of these operations. Rendering content negotiated views is something that I hope ASP.NET vNext will provide natively in the combined MVC and WebAPI model, but we’ll see how this actually will be implemented. In the meantime, having a custom ActionResult that provides this functionality is a workable and easily adaptable way of handling this going forward. Whatever ends up happening in ASP.NET vNext, the abstraction can probably be changed to support the native features of the future. Anyway, I hope some of you found this useful, if not for direct integration then as insight into some of the rendering logic that MVC uses to get output into the HTTP stream… Related Resources: Latest Version of NegotiatedResult.cs on GitHub; Understanding Action Controllers; Rendering ASP.NET Views To String. © Rick Strahl, West Wind Technologies, 2005-2014. Posted in MVC, ASP.NET, HTTP.

    Read the article

  • Strange end-user experience with Liferay, Glassfish and Apache on RedHat

    - by Pete Helgren
    Tried multiple forums to get to the bottom of this. I hope I can get some direction here: Here is the stack I am working with: Red Hat Enterprise Linux Server release 5.6 (Tikanga) Liferay 6.0.6 on Glassfish 3.0.1 MySQL 5.0.77 Apache 2.2.3 The Liferay portal provides a variety of portlets to end users. Static content (web pages), static resources (primarily pdf and mp3 files 1mb - 80mb in size), File upload and download capabilities (primarily 40-60mb mp3 files) and online streaming of those MP3 files. Here is the strange end user experiences: Under normal load: (20-30) users uploading, downloading or streaming files and 20-30 accessing static content (some of it downloads), we see the following: 1) Clicking a link triggers the download of a portion of an MP3 (the portion is a few seconds long). 2) Clicking on a link tiggers the download of the page content rather than rendering. 3) Clicking a link causes the page to dump binary data to the end user rather than the expected content. 4) Clicking a link returns the text of a javascript file rather than rendering the page. Each occurrence is totally random (or appears so). Sometimes it works, sometimes it doesn't. It seems to have no relation to browser or client OS. The strange events seem to occur much more frequently when using an SSL connection rather than regular http. Apache serves as a proxy server only (reverse). It basically passes all the requests through to Glassfish. There isn't any static content proxy served by Apache. We rebuilt the entire stack from scratch and redeployed the portlet wars and still have the same issues. Liferay is running as a single server (not clustered). We disabled mod_cache in Apache. The problems are more frequent as the server load grows. This morning the load is pretty light and we are seeing few problems but the use of the site will grow, particularly tonight around 9pm CST through Wednesday morning. You could try the site (http://preview.bsfinternational.org) during those times and I would expect that you might experience one of the weirdnesses as you randomly click links on the site (https is invoked only when signed in). Again, https seems to exacerbate the issue. This seems very much like a caching issue but I don't know where in the stack to start peeling the onion. Apache? Liferay? Glassfish? MySQL? Maybe even Redhat? We are stumped and most forums we have posted to (LifeRay and Glassfish) have returned very few suggestions. I just need an idea of where to start looking. I understand that we could have a portlet EDIT: Opening the files in a Hex editor that appear to be pages that download rather than render, we see that the first 4000 characters are "junk" and then the "HTTP/1.1 ...." 'normal' header is seen. So something is dumping a jumble of characters up to offset 4000 (when viewing it in a Hex editor). Perhaps a clue? Ideas?

    Read the article

  • Creating Item Templates as Visual Studio 2010 Extensions

    - by maziar
    Technorati Tags: Visual Studio 2010 Extension, T4 Template, VSIX, Item Template Wizard

    This blog post briefly introduces creating an item template as a Visual Studio 2010 extension.

    Problem specification

    Assume you are writing a framework for data-oriented applications and you decide to keep all of your application messages in a SQL Server database table. After creating the table, you create a class in your framework for getting messages by a string key:

        var message = FrameworkMessages.Get("ChangesSavedSuccess");

    Everyone would say this code is error prone, because the message keys are not strongly typed and can cause errors in the application that are not caught in tests. So we think of a way to make it strongly typed, i.e. create a class so we can use it like this:

        var message = Messages.ChangesSavedSuccess;

    In the Messages class the code looks like this:

        public string ChangesSavedSuccess
        {
            get { return FrameworkMessages.Get("ChangesSavedSuccess"); }
        }

    And clearly we are not going to write the Messages class by hand; we need a class generator for it.

    Also assume that the applications that use our framework contain multiple sub-systems, so each sub-system needs its own strongly typed message class that calls the FrameworkMessages.Get method. We would therefore like to make our code generator an item template, so that each developer can easily add the item to a project with no extra work.

    Solution

    We create a T4 text template that generates the strongly typed class from the database, then create a Visual Studio item template from this generator and publish it.

    What Are T4 Templates

    You might already be familiar with T4 templates; if so, you can skip this section. A T4 text template is a Visual Studio file type (.tt) that generates text output. The file is a mixture of text blocks and code logic (in C# or VB). For example, you can generate HTML files, C# classes, resource files and so on with a T4 template.

    Syntax highlighting

    In Visual Studio 2010 a T4 template is not syntax highlighted by default, and no auto-completion is supported for it. But there is a fine Visual Studio extension named 'Visual T4' which can be downloaded for free from the Visual Studio Gallery. This tool offers IntelliSense, syntax coloring, validation, transformation preview and more for T4 templates.

    How Item Templates work in Visual Studio

    Visual Studio extensions allow us to add functionality to Visual Studio. In our case we need to create a .vsix file that adds a template to the Visual Studio item templates. Item templates are zip files containing the template file and a metadata file with a .vstemplate extension; the .vstemplate file is an XML file that provides information about the template. A .vsix file is also a zip file (renamed to .vsix) that is opened by the Visual Studio extension installer. (Re-installing a .vsix file requires the old one to be uninstalled from VS first: Tools > Extension Manager.) Installing a .vsix requires Visual Studio to be closed and re-opened to take effect. The extension installer finds the item template's zip file and copies it to Visual Studio's item templates folder. You can find the other Visual Studio templates in [<VS Install Path>\Common7\IDE\ItemTemplates] and you can edit them, but be very careful with the VS default templates.

    How Can I Create a VSIX file

    1. Visual Studio SDK. Depending on your Visual Studio version, you need to download the Microsoft Visual Studio SDK.
    Note that if you have VS 2010 SP1, you will need to download and install the VS 2010 SP1 SDK; the VS 2010 SDK will not install (unless you change the registry value that indicates your service pack number). Here is the link for the VS 2010 SP1 SDK. After downloading, run it and follow the wizard to complete the installation.

    2. Create the file you want to turn into an Item Template. Create a project (or use an existing one), add your file, and edit it until it works the way you want.

    Back to our own problem: we need to create a T4 (.tt) template. In your VS project: Add > New Item > General > Text Template. Type a file name, e.g. Messages.tt, and press Add. Then write the T4 template (I do not intend to cover T4 syntax in this post, so I'll just show the code, which should be clear enough to follow):

        <#@ template debug="false" hostspecific="true" language="C#" #>
        <#@ output extension=".cs" #>
        <#@ Assembly Name="System.Data" #>
        <#@ Import Namespace="System.Data.SqlClient" #>
        <#@ Import Namespace="System.Text" #>
        <#@ Import Namespace="System.IO" #>
        <#
            var connectionString = "server=Maziar-PC; Database=MyDatabase; Integrated Security=True";
            var systemName = "Sys1";
            var builder = new StringBuilder();
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                var command = connection.CreateCommand();
                command.CommandText = string.Format("SELECT [Key] FROM [Message] WHERE System = '{0}'", systemName);
                var reader = command.ExecuteReader();
                while (reader.Read())
                {
                    builder.AppendFormat("        public static string {0} {{ get {{ return FrameworkMessages.Get(\"{0}\"); }} }}\r\n", reader[0]);
                }
            }
        #>
        namespace <#= System.Runtime.Remoting.Messaging.CallContext.LogicalGetData("NamespaceHint") #>
        {
            public static class <#= Path.GetFileNameWithoutExtension(Host.TemplateFile) #>
            {
        <#= builder.ToString() #>
            }
        }

    As you can see, the T4 template connects to a database, reads the message keys and generates a class. Here is the output:

        namespace MyProject.MyFolder
        {
            public static class Messages
            {
                public static string ChangesSavedSuccess { get { return FrameworkMessages.Get("ChangesSavedSuccess"); } }
                public static string ErrorSavingChanges { get { return FrameworkMessages.Get("ErrorSavingChanges"); } }
            }
        }
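    The generated class relies on a FrameworkMessages.Get method that the post assumes already exists in the framework and never shows. As a rough, illustrative sketch only (not the author's actual code; the [Text] column name and the in-memory cache are my assumptions), it might look something like this:

        using System.Collections.Generic;
        using System.Data.SqlClient;

        public static class FrameworkMessages
        {
            // Simple in-memory cache so each key hits the database only once.
            private static readonly Dictionary<string, string> Cache = new Dictionary<string, string>();

            public static string Get(string key)
            {
                string message;
                if (Cache.TryGetValue(key, out message))
                    return message;

                // Assumed connection string and schema, matching the [Message] table queried by the T4 template.
                using (var connection = new SqlConnection("server=Maziar-PC; Database=MyDatabase; Integrated Security=True"))
                {
                    connection.Open();
                    var command = connection.CreateCommand();
                    command.CommandText = "SELECT [Text] FROM [Message] WHERE [Key] = @key";
                    command.Parameters.AddWithValue("@key", key);
                    message = (string)command.ExecuteScalar();
                }

                Cache[key] = message;
                return message;
            }
        }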
    The output looks fine, but there is one problem: connectionString and systemName are hard-coded. So how can we create a flexible item template? One of the features of item templates in Visual Studio is that you can create a designer wizard for your template, so we can collect the connection information and system name there. Now let's go on to creating the .vsix file.

    3. Create the Template. In Visual Studio click File > Export Template and a wizard will show up. In the first step choose Item Template and, in the combo box, select the project that contains Messages.tt; click Next. In the second step select Messages.tt from the list and click Next. In the third step choose the references; for this template System and System.Data are needed, so choose them and click Next. In the last step enter the template name and description and, if you like, add a .ico file as the icon as well as a preview image. Uncheck 'automatically add the template …', copy the output location to the clipboard, and click Finish.

    4. Create a VSIX Project. In VS, click File > New > Project > Extensibility > VSIX Project and type a name, e.g. FrameworkMessages, a location, etc. The project will include a .vsixmanifest file. Fill in fields like Author, Product Name, Description, etc. In the Content section, click Add Content, choose Item Template as the content type and File as the source. Remember you copied the template's output location to the clipboard? Paste it into the File field and click OK.

    5. Build the VSIX Project. That's it: build the project and go to the output directory, where you will see a .vsix file that you can run right away. After restarting VS, if you click on a project > Add > New Item, you will see your item in the list and you can add it. When you add this item to a project, references to System and System.Data are added if the project does not already have them. But the problem mentioned in step 2 shows up now.

    6. Create a Design Wizard for your Item Template. Create a project, e.g. a Windows Application named 'Framework.Messages.Design' (but it's better to change its output type to Class Library). Add references to Microsoft.VisualStudio.TemplateWizardInterface and EnvDTE. Add a class named MessageDesigner to your project and implement the IWizard interface in it. This is what you should write:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using Microsoft.VisualStudio.TemplateWizard;
        using EnvDTE;

        namespace Framework.Messages.Design
        {
            class MessageDesigner : IWizard
            {
                private bool CanAddProjectItem;

                public void RunStarted(object automationObject, Dictionary<string, string> replacementsDictionary, WizardRunKind runKind, object[] customParams)
                {
                    // Prompt user for Connection String and System Name in a Windows Form (ShowDialog) here
                    // (try to provide a good interface)
                    // if the user clicks Cancel on your Windows Form, return;

                    string connectionString = "connection;string"; // Set value from the form
                    string systemName = "system;name";             // Set value from the form

                    CanAddProjectItem = true;
                    replacementsDictionary.Add("$connectionString$", connectionString);
                    replacementsDictionary.Add("$systemName$", systemName);
                }

                public bool ShouldAddProjectItem(string filePath)
                {
                    return CanAddProjectItem;
                }

                public void BeforeOpeningFile(ProjectItem projectItem)
                {
                }

                public void ProjectFinishedGenerating(Project project)
                {
                }

                public void ProjectItemFinishedGenerating(ProjectItem projectItem)
                {
                }

                public void RunFinished()
                {
                }
            }
        }

    Before your code runs, replacementsDictionary contains the list of default template parameters; after it runs, the two extra parameters have been added. Now build this project and copy the output assembly to the [<VS Install Path>\Common7\IDE] folder. Your designer is ready.
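    The RunStarted comment above leaves the actual prompt form to the reader. A bare-bones sketch of such a form (the class name, layout and property names here are my own and not from the original post) could look like this:

        using System.Windows.Forms;

        public class ConnectionPromptForm : Form
        {
            private readonly TextBox _connectionString = new TextBox { Width = 300 };
            private readonly TextBox _systemName = new TextBox { Width = 300 };

            public string ConnectionString { get { return _connectionString.Text; } }
            public string SystemName { get { return _systemName.Text; } }

            public ConnectionPromptForm()
            {
                Text = "Messages Template Settings";

                var ok = new Button { Text = "OK", DialogResult = DialogResult.OK };
                var cancel = new Button { Text = "Cancel", DialogResult = DialogResult.Cancel };

                // Stack the controls vertically; a real form would use a nicer layout.
                var layout = new FlowLayoutPanel { Dock = DockStyle.Fill, FlowDirection = FlowDirection.TopDown };
                layout.Controls.Add(new Label { Text = "Connection string:", AutoSize = true });
                layout.Controls.Add(_connectionString);
                layout.Controls.Add(new Label { Text = "System name:", AutoSize = true });
                layout.Controls.Add(_systemName);
                layout.Controls.Add(ok);
                layout.Controls.Add(cancel);
                Controls.Add(layout);

                AcceptButton = ok;
                CancelButton = cancel;
            }
        }

    RunStarted would then ShowDialog() this form, copy the two values into replacementsDictionary when DialogResult.OK comes back, and leave CanAddProjectItem false otherwise so the item is not added.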
    The template you created earlier has now been added to your VSIX project. In Windows Explorer, open the template's zip file (extract it somewhere) and open the .vstemplate file. First of all, remove

        <ProjectItem SubType="Code" TargetFileName="$fileinputname$.cs" ReplaceParameters="true">Messages.cs</ProjectItem>

    because the .cs file is not intended to be part of the template; it will be generated. Change the value of ReplaceParameters for your .tt file to true to enable parameter replacement in that file. Then, right after the </TemplateContent> end element, write this:

        <WizardExtension>
          <Assembly>Framework.Messages.Design</Assembly>
          <FullClassName>Framework.Messages.Design.MessageDesigner</FullClassName>
        </WizardExtension>

    One other thing you should do is edit your .tt file and remove your .cs file. Lines 8 and 9 of your .tt file should now be:

            var connectionString = "$connectionString$";
            var systemName = "$systemName$";

    These parameters will be replaced when the item is added to a project. Save the contents back to a zip file with the same file name and replace the original file.

    Now build your VSIX project again, uninstall your extension, close VS and run the new .vsix file. Open VS, add an item of type Messages to your project and, bingo, your wizard form shows up. Fill in the fields and click OK; the values are replaced in the .tt file that gets added.

    That's it. I tried hard to keep this post brief; I hope it was not too long…

    Cheers
    Maziar

    Read the article

  • Simple Cisco ASA 5505 config issue

    - by Ben Sebborn
    I have a Cisco ASA setup with two interfaces:

    inside: 192.168.2.254 / 255.255.255.0, SecLevel: 100
    outside: 192.168.3.250 / 255.255.255.0, SecLevel: 0

    I have a static route set up to allow PCs on the inside network to access the internet via a gateway on the outside interface (3.254):

    outside 0.0.0.0 0.0.0.0 192.168.3.254

    This all works fine. I now need to be able to access a PC on the outside interface (3.253) from a PC on the inside interface on port 35300. I understand I should be able to do this with no problems, as I'm going from a higher security level to a lower one. However, I can't get any connection. Do I need to set up a separate static route? Perhaps the route above is overriding what I need to be able to do (is it routing ALL traffic through the gateway?). Any advice on how to do this would be appreciated. I am configuring this via ASDM, but the config can be seen below:

    Result of the command: "show running-config"

    : Saved
    :
    ASA Version 8.2(5)
    !
    hostname ciscoasa
    domain-name xxx.internal
    names
    name 192.168.2.201 dev.xxx.internal description Internal Dev server
    name 192.168.2.200 Newserver
    !
    interface Ethernet0/0
     switchport access vlan 2
    !
    interface Ethernet0/1
    !
    interface Ethernet0/2
    !
    interface Ethernet0/3
     shutdown
    !
    interface Ethernet0/4
     shutdown
    !
    interface Ethernet0/5
     shutdown
    !
    interface Ethernet0/6
     shutdown
    !
    interface Ethernet0/7
     shutdown
    !
    interface Vlan1
     nameif inside
     security-level 100
     ip address 192.168.2.254 255.255.255.0
    !
    interface Vlan2
     nameif outside
     security-level 0
     ip address 192.168.3.250 255.255.255.0
    !
    !
    time-range Workingtime
     periodic weekdays 9:00 to 18:00
    !
    ftp mode passive
    clock timezone GMT/BST 0
    clock summer-time GMT/BDT recurring last Sun Mar 1:00 last Sun Oct 2:00
    dns domain-lookup inside
    dns server-group DefaultDNS
     name-server Newserver
     domain-name xxx.internal
    same-security-traffic permit inter-interface
    object-group service Mysql tcp
     port-object eq 3306
    object-group protocol TCPUDP
     protocol-object udp
     protocol-object tcp
    access-list inside_access_in extended permit ip any any
    access-list outside_access_in remark ENABLES OUTSDIE ACCESS TO DEV SERVER!
    access-list outside_access_in extended permit tcp any interface outside eq www time-range Workingtime inactive
    access-list outside_access_in extended permit tcp host www-1.xxx.com interface outside eq ssh
    access-list inside_access_in_1 extended permit tcp any any eq www
    access-list inside_access_in_1 extended permit tcp any any eq https
    access-list inside_access_in_1 remark Connect to SSH services
    access-list inside_access_in_1 extended permit tcp any any eq ssh
    access-list inside_access_in_1 remark Connect to mysql server
    access-list inside_access_in_1 extended permit tcp any host mysql.xxx.com object-group Mysql
    access-list inside_access_in_1 extended permit tcp any host mysql.xxx.com eq 3312
    access-list inside_access_in_1 extended permit object-group TCPUDP host Newserver any eq domain
    access-list inside_access_in_1 extended permit icmp any any
    access-list inside_access_in_1 remark Draytek Admin
    access-list inside_access_in_1 extended permit tcp any 192.168.3.0 255.255.255.0 eq 4433
    access-list inside_access_in_1 remark Phone System
    access-list inside_access_in_1 extended permit tcp any 192.168.3.0 255.255.255.0 eq 35300 log disable
    pager lines 24
    logging enable
    logging asdm warnings
    logging from-address [email protected]
    logging recipient-address [email protected] level errors
    mtu inside 1500
    mtu outside 1500
    ip verify reverse-path interface inside
    ip verify reverse-path interface outside
    ipv6 access-list inside_access_ipv6_in permit tcp any any eq www
    ipv6 access-list inside_access_ipv6_in permit tcp any any eq https
    ipv6 access-list inside_access_ipv6_in permit tcp any any eq ssh
    ipv6 access-list inside_access_ipv6_in permit icmp6 any any
    icmp unreachable rate-limit 1 burst-size 1
    icmp permit any outside
    no asdm history enable
    arp timeout 14400
    global (outside) 1 interface
    nat (inside) 1 0.0.0.0 0.0.0.0
    static (inside,outside) tcp interface www dev.xxx.internal www netmask 255.255.255.255
    static (inside,outside) tcp interface ssh dev.xxx.internal ssh netmask 255.255.255.255
    access-group inside_access_in in interface inside control-plane
    access-group inside_access_in_1 in interface inside
    access-group inside_access_ipv6_in in interface inside
    access-group outside_access_in in interface outside
    route outside 0.0.0.0 0.0.0.0 192.168.3.254 10
    route outside 192.168.3.252 255.255.255.255 192.168.3.252 1
    timeout xlate 3:00:00
    timeout conn 1:00:00 half-closed 0:10:00 udp 0:02:00 icmp 0:00:02
    timeout sunrpc 0:10:00 h323 0:05:00 h225 1:00:00 mgcp 0:05:00 mgcp-pat 0:05:00
    timeout sip 0:30:00 sip_media 0:02:00 sip-invite 0:03:00 sip-disconnect 0:02:00
    timeout sip-provisional-media 0:02:00 uauth 0:05:00 absolute
    timeout tcp-proxy-reassembly 0:01:00
    timeout floating-conn 0:00:00
    dynamic-access-policy-record DfltAccessPolicy
    aaa authentication telnet console LOCAL
    aaa authentication enable console LOCAL

    Read the article

< Previous Page | 192 193 194 195 196 197 198 199 200 201 202 203  | Next Page >