Search Results

Search found 593 results on 24 pages for 'transparency'.


  • College Ratings via the Federal Government

    - by user9147039
    A few weeks back you might remember news about a higher education rating system proposal from the Obama administration. As I've discussed previously, political and stakeholder pressures to improve outcomes and increase transparency are stronger than ever before. The executive branch proposal is intended to make progress in this area. Quoting from the proposal itself: "The ratings will be based upon such measures as: Access, such as percentage of students receiving Pell grants; Affordability, such as average tuition, scholarships, and loan debt; and Outcomes, such as graduation and transfer rates, graduate earnings, and advanced degrees of college graduates."

    This is going to be quite complex, to say the least. Most notably, higher ed is not monolithic. From community and other 2-year colleges, to small private 4-year colleges, to professional schools, to large public research institutions, the walks of higher ed life are, well, many. Designing a ratings system that doesn't wind up with lots of unintended consequences and collateral damage will be difficult. At best you could end up tarnishing the reputation of institutions that are actually performing well against the metrics and outcome measures that make sense in their particular context of education. At worst you could spend a lot of time and resources designing a system that loses credibility with its "customers".

    A lot of institutions I work with already have systems like the one described above in place. They are tracking completion rates, completion timeframes, transfers to other institutions, job placement, and salary information. As I talk to these institutions, several constants are worth noting:

    • Deciding which metrics to measure is complicated. While employment and salary data are relatively easy to track, qualitative measures are more difficult. How do you quantify the benefit to someone who studies in a field that may not compensate him or her as well as another, but that provides huge personal fulfillment and reward?

    • The data is available, but the systems to transform the data into actual information that can be used in meaningful ways are not. Too often in higher ed, information is siloed; much of the data that needs to be part of a comprehensive system sits in multiple organizations, oftentimes outside the reach of core IT.

    • Politics and culture are big barriers. One of the areas that my team and I spend a lot of time talking about with higher ed institutions all over the world is the imperative to optimize for student success. This, like tracking students' achievement after graduation, requires a level of organizational capacity that does not currently exist. The primary barrier is the culture of "data islands" in higher ed, and the need for leadership to break down the divisions between departments, schools, and colleges and to institute academy-wide analytics and data stewardship initiatives that will enable student success.

    • Data quality is a very big issue. Many disparate systems exist (some on premise, some "in the cloud") that keep data about "persons" using different means to identify them. Establishing a single source of truth about an individual and his or her data is difficult without some type of data quality policy and tools. Good tools actually exist but are seldom leveraged.

    Don't misunderstand: I think it's a great idea to drive additional transparency and accountability into the system of higher education. And not just at home, but globally. Students and parents need access to key data to make informed, responsible choices. The tools exist not only to share this kind of information, but to capture the very metrics stakeholders care most about, in a way that makes sense in the context of a given institution's "place" in the overall higher ed panoply.

    Read the article

  • How to restore/change Alt+Tab behaviour/RAM usage and a few other things after Ubuntu upgrade from 11.04 to 11.10?

    - by fiktor
    I use Ubuntu for programming. I recently updated it from 11.04 to 11.10, and there are some things I don't like in the new version of the Unity desktop interface. I don't know whether restoring the previous behavior is hard, and if it is not, where I should look to do it. I know a bit of programming, but I really don't know much about Linux settings.

    I used to have 3-6 terminal windows open and would switch between them with Alt+Tab and Shift+Alt+Tab. I liked half-transparent terminal windows: I could open a web page with some instructions in Firefox, press Alt+Tab, and type commands in a console window while still being able to read the text on the web page under it. Now my usual work style is broken by the following.

    List of "negative" changes:

    1. Alt+Tab shows just one icon for all console windows. If I wait some time it does show all the windows, but I don't like to wait. I prefer to remember the order of the windows and press Alt+Tab as many times as I need to reach the right one.
    2. Alt+Shift+Tab to switch in reverse order doesn't work now.
    3. Console windows are not transparent any more.
    4. When I don't wait, and switch to this grouped icon, it shows all console windows at once. So even if they were transparent, I wouldn't be able to see anything below them (I can read something only from the window directly under the current one, not a few levels under).
    5. When I run a few console windows in Unity I had 740 MB used on Ubuntu 11.04, but I have 1050 MB now. The question is how to bring that back down to 750 MB or less. I really need my memory, since I use my computer to work with 1512 MB of data and I try to save every 10 MB possible (if it doesn't take too much of the machine's time and, more importantly, my time).
    6. When I press the Super key I get a field to type the name of the program I want to run. Sometimes the field shows, but nothing happens when I type; probably focus is not on the right field.

    I don't really mean to restore exactly the same behavior, but I want to make my work in Ubuntu 11.10 efficient (at least as efficient as in Ubuntu 11.04). I would be happy if there are ways to accomplish that.

    What have I tried: I have installed CompizConfig Settings Manager and I have read this question. However, enabling "Static Application Switcher" makes Alt+Tab crazy: after enabling it, it reports key-binding conflicts with the "Ubuntu Unity Plugin"; Alt+Tab switching doesn't change, but Shift+Alt+Tab now works and shows all windows; and memory usage increases. I have tried turning off the Ubuntu Unity Plugin, but this doesn't seem like the right thing to do, since it appears to turn off all menus, a lot of keystrokes, and the app launcher, which usually activates with the Super key. I have found that window transparency can be enabled by the "Opacity, Brightness and Saturation" plugin under Accessibility; however, I don't know if enabling it is the right thing to do (at least it increases memory usage).

    Update: everything is solved but #3; see my own answer below. I have made a separate question about issue #3 (transparency).
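    One low-level thing worth checking for issue #3 (a hedged guess, not from the original question): gnome-terminal of that era kept per-profile transparency in gconf, and the setting can be lost across an upgrade. A minimal sketch, assuming the stock gnome-terminal and its "Default" profile (the key names come from the GNOME 2.x-era gconf schema and may differ on a given install):

        # Re-enable per-profile background transparency directly in gconf.
        gconftool-2 --set /apps/gnome-terminal/profiles/Default/background_type \
                    --type string transparent
        # 0.0 is fully transparent, 1.0 is opaque.
        gconftool-2 --set /apps/gnome-terminal/profiles/Default/background_darkness \
                    --type float 0.5

    Whether the effect actually shows through still depends on the compositing setup, which is presumably why the Compiz plugins mentioned above interact with it.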

    Read the article

  • AllowPartiallyTrustedCallersAttribute exception - unit testing with moq

    - by vdh_ant
    Hi guys, I am receiving the following exception when trying to run my unit tests using .NET 4.0 under VS2010 with Moq 3.1:

        Attempt by security transparent method 'SPPD.Backend.DataAccess.Test.Specs_for_Core.When_using_base.Can_create_mapper()' to access security critical method 'Microsoft.VisualStudio.TestTools.UnitTesting.Assert.IsNotNull(System.Object)' failed. Assembly 'SPPD.Backend.DataAccess.Test, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' is marked with the AllowPartiallyTrustedCallersAttribute, and uses the level 2 security transparency model. Level 2 transparency causes all methods in AllowPartiallyTrustedCallers assemblies to become security transparent by default, which may be the cause of this exception.

    The test I am running is really straightforward and looks something like the following:

        [TestMethod]
        public void Can_create_mapper()
        {
            this.SetupTest();

            var mockMapper = new Moq.Mock<IMapper>().Object;
            this._Resolver.Setup(x => x.Resolve<IMapper>()).Returns(mockMapper).Verifiable();

            var testBaseDa = new TestBaseDa();
            var result = testBaseDa.TestCreateMapper<IMapper>();

            Assert.IsNotNull(result);   // <<< THROWS EXCEPTION HERE
            Assert.AreSame(mockMapper, result);

            this._Resolver.Verify();
        }

    I have no idea what this means, and I have been looking around and have found very little on the topic. The closest reference I have found is http://dotnetzip.codeplex.com/Thread/View.aspx?ThreadId=80274, but it's not very clear on what they did to fix it... Anyone got any ideas?
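    A hedged note, not from the original question: the exception text itself points at the usual fix. The test assembly carries AllowPartiallyTrustedCallersAttribute under the level 2 transparency model, which turns every test method security transparent. Removing that attribute from the test project's AssemblyInfo.cs, or explicitly opting the assembly back into the level 1 rules, is the commonly suggested workaround:

        // AssemblyInfo.cs of the test assembly -- a sketch, assuming you are free
        // to change its security annotations: reverting to the .NET 2.0 (level 1)
        // rules stops [AllowPartiallyTrustedCallers] from forcing everything
        // transparent.
        using System.Security;

        [assembly: SecurityRules(SecurityRuleSet.Level1)]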

    Read the article

  • How to change UITabBar selected color?

    - by RAGOpoR
    According to this post, will Apple also reject this code for now? And how would I implement something Apple will approve?

        @interface UITabBar (ColorExtensions)
        - (void)recolorItemsWithColor:(UIColor *)color shadowColor:(UIColor *)shadowColor shadowOffset:(CGSize)shadowOffset shadowBlur:(CGFloat)shadowBlur;
        @end

        @interface UITabBarItem (Private)
        @property(retain, nonatomic) UIImage *selectedImage;
        - (void)_updateView;
        @end

        @implementation UITabBar (ColorExtensions)
        - (void)recolorItemsWithColor:(UIColor *)color shadowColor:(UIColor *)shadowColor shadowOffset:(CGSize)shadowOffset shadowBlur:(CGFloat)shadowBlur
        {
            CGColorRef cgColor = [color CGColor];
            CGColorRef cgShadowColor = [shadowColor CGColor];
            for (UITabBarItem *item in [self items]) {
                if ([item respondsToSelector:@selector(selectedImage)] &&
                    [item respondsToSelector:@selector(setSelectedImage:)] &&
                    [item respondsToSelector:@selector(_updateView)]) {
                    CGRect contextRect;
                    contextRect.origin.x = 0.0f;
                    contextRect.origin.y = 0.0f;
                    contextRect.size = [[item selectedImage] size];

                    // Retrieve source image and begin image context
                    UIImage *itemImage = [item image];
                    CGSize itemImageSize = [itemImage size];
                    CGPoint itemImagePosition;
                    itemImagePosition.x = ceilf((contextRect.size.width - itemImageSize.width) / 2);
                    itemImagePosition.y = ceilf((contextRect.size.height - itemImageSize.height) / 2);
                    UIGraphicsBeginImageContext(contextRect.size);
                    CGContextRef c = UIGraphicsGetCurrentContext();

                    // Setup shadow
                    CGContextSetShadowWithColor(c, shadowOffset, shadowBlur, cgShadowColor);

                    // Setup transparency layer and clip to mask
                    CGContextBeginTransparencyLayer(c, NULL);
                    CGContextScaleCTM(c, 1.0, -1.0);
                    CGContextClipToMask(c, CGRectMake(itemImagePosition.x, -itemImagePosition.y, itemImageSize.width, -itemImageSize.height), [itemImage CGImage]);

                    // Fill and end the transparency layer
                    CGContextSetFillColorWithColor(c, cgColor);
                    contextRect.size.height = -contextRect.size.height;
                    CGContextFillRect(c, contextRect);
                    CGContextEndTransparencyLayer(c);

                    // Set selected image and end context
                    [item setSelectedImage:UIGraphicsGetImageFromCurrentImageContext()];
                    UIGraphicsEndImageContext();

                    // Update the view
                    [item _updateView];
                }
            }
        }
        @end
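    For reference, a hypothetical call site for the category above (a sketch only, derived from the method's signature; note that the code still relies on the private selectedImage/_updateView API, which is exactly the kind of thing App Store review tends to reject):

        // Tint the selected tab images red, with a slight upward black shadow.
        [self.tabBarController.tabBar recolorItemsWithColor:[UIColor redColor]
                                                shadowColor:[UIColor blackColor]
                                               shadowOffset:CGSizeMake(0.0f, -1.0f)
                                                 shadowBlur:3.0f];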

    Read the article

  • Moq broken? Not working with .net 4.0

    - by vdh_ant
    Hi guys, I am receiving the following exception when trying to run my unit tests using .NET 4.0 under VS2010 with Moq 3.1:

        Attempt by security transparent method 'SPPD.Backend.DataAccess.Test.Specs_for_Core.When_using_base.Can_create_mapper()' to access security critical method 'Microsoft.VisualStudio.TestTools.UnitTesting.Assert.IsNotNull(System.Object)' failed. Assembly 'SPPD.Backend.DataAccess.Test, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' is marked with the AllowPartiallyTrustedCallersAttribute, and uses the level 2 security transparency model. Level 2 transparency causes all methods in AllowPartiallyTrustedCallers assemblies to become security transparent by default, which may be the cause of this exception.

    The test I am running is really straightforward and looks something like the following:

        [TestMethod]
        public void Can_create_mapper()
        {
            this.SetupTest();

            var mockMapper = new Moq.Mock<IMapper>().Object;
            this._Resolver.Setup(x => x.Resolve<IMapper>()).Returns(mockMapper).Verifiable();

            var testBaseDa = new TestBaseDa();
            var result = testBaseDa.TestCreateMapper<IMapper>();

            Assert.IsNotNull(result);   // <<< THROWS EXCEPTION HERE
            Assert.AreSame(mockMapper, result);

            this._Resolver.Verify();
        }

    I have no idea what this means, and I have been looking around and have found very little on the topic. The closest reference I have found is http://dotnetzip.codeplex.com/Thread/View.aspx?ThreadId=80274, but it's not very clear on what they did to fix it... Anyone got any ideas?

    Read the article

  • sIFR displays only the first line of text in Opera when on transparent background

    - by Gary
    I have implemented sIFR for the first time, on a test page. The code I have is below. It works fine in IE7, Firefox, Safari and Chrome, but in Opera only the first line of sIFR-ed text appears when the page first loads and after refreshing the page. But, if I scroll the page, all the text appears! It seems to have to do with transparency, because if I turn transparency off, it works fine. Please can someone help me to make this work? Thanks, Gary

        <link rel="stylesheet" href="sIFR-print.css" type="text/css" media="print" />
        <link rel="stylesheet" href="all.css" type="text/css" media="all" />
        <script src="sifr.js" type="text/javascript"></script>
        <script src="sifr-config.js" type="text/javascript"></script>
        <script type="text/javascript">
        //<![CDATA[
          var sIFRfont = { src: 'fontname.swf' };
          sIFR.activate(sIFRfont);
          sIFR.replace(sIFRfont, {
            css: [
              '.sIFR-root { line-height: 1em; font-size: 64px; color: #000000; background-color: blue; text-align: left; font-weight: normal; font-style: normal; text-decoration: none; visibility: hidden; }'
            ],
            fitExactly      : true,
            forceClear      : true,
            forceSingleLine : false,
            selector        : 'div.flashtext',
            transparent     : true
          });
        //]]>
        </script>

    Read the article

  • Make an image transparent in IE to show non-transparent background

    - by Select0r
    Hi, I'm trying to get this thing to work in IE (any version; it works in FF, Opera, Safari, Chrome...): I have a DIV with a background-image. The DIV also contains an image that should become transparent onMouseOver. The expected behaviour is that the DIV background would shine through the transparent image (which it does in all browsers but IE). Instead, it looks like the image is becoming transparent against a white background; I can't see the DIV's background through the image.

    Here's some code:

        <div><a href="#" class="dragItem"><img /></a></div>

    And some CSS:

        .dojoDndItemOver {
            cursor         : pointer;
            filter         : alpha(opacity = 50);
            opacity        : 0.5;
            -moz-opacity   : 0.5;
            -khtml-opacity : 0.5;
        }

        .dragItem:hover {
            filter         : alpha(opacity = 30);
            opacity        : 0.3;
            -moz-opacity   : 0.3;
            -khtml-opacity : 0.3;
            background     : none;
        }

    All of this is embedded in a Dojo drag-and-drop system, so dojoDndItemOver is automatically set on the DIV on mouse-over, and dragItem is set on the anchor around the image (using the same class on the image directly doesn't work at all, as IE doesn't support :hover on elements other than anchors). Any ideas? Or is it an IE speciality to just "simulate" transparency on images by somehow greying them out instead of providing real transparency and showing whatever is beneath?
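    A hedged workaround sketch (not from the original post): old IE composites an alpha-filtered element against that element's own background, which is effectively white when it has none, so one common trick is to repeat the DIV's background on the anchor itself. The file name below is hypothetical:

        /* Give the faded image the same backdrop IE uses for compositing. */
        a.dragItem {
            display: inline-block;        /* gives the anchor layout so the background applies */
            background: url(div-bg.png);  /* hypothetical: the same image as the DIV's background */
        }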

    Read the article

  • Java3D: Problem with order of objects that have a transparent PNG texture

    - by Sebastian
    Hello! Today I tried to program a little fish tank with Java 3D. The fish tank rotates and fishes are placed in it. The fishes in the box are Java 3D boxes with a PNG texture that has an alpha channel. Without transparency activated, the order of the objects is correct. But when I enable it, some fishes in the back come to the front, which looks really wrong. I tried NICEST, FASTEST and BLENDED as transparency options, but it had no effect. Does someone know what the problem could be?

        Vector3f[] posf = new Vector3f[5];
        posf[0] = new Vector3f(-0.22f, -0.1f, -0.2f);
        posf[1] = new Vector3f(-0.34f, 0.1f, 0.2f);
        posf[2] = new Vector3f(0.3f, -0.2f, 0.3f);

        Appearance fischapp = new Appearance();
        fischapp.setTransparencyAttributes(
            new TransparencyAttributes(TransparencyAttributes.NICEST, 1f));
        try {
            fischapp.setTexture(new TextureLoader(ImageIO.read(new File("nemo.png")), this).getTexture());
        } catch(IOException exc) {
            System.out.println(exc.getMessage());
        }
        for(int i = 0; i <   // (the rest of the loop was cut off in the original post)

    Thank you!
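    A hedged guess at a fix (not from the original post): with blended transparency, Java 3D draws transparent shapes in scene-graph order unless it is told to depth-sort them per frame, so enabling geometry-level transparency sorting on the View often cures exactly this back-to-front popping. A minimal sketch, assuming the scene was created from a SimpleUniverse named u:

        import javax.media.j3d.View;
        import com.sun.j3d.utils.universe.SimpleUniverse;

        View view = u.getViewer().getView();
        // Depth-sort transparent geometry every frame instead of relying on draw order.
        view.setTransparencySortingPolicy(View.TRANSPARENCY_SORT_GEOMETRY);

    It is also worth double-checking the TransparencyAttributes above: BLENDED mode with a transparency value below 1.0f is what you usually want, since 1.0f means fully invisible.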

    Read the article

  • Weird .net 4.0 exception when running unit tests

    - by vdh_ant
    Hi guys, I am receiving the following exception when trying to run my unit tests using .NET 4.0 under VS2010 with Moq 3.1:

        Attempt by security transparent method 'SPPD.Backend.DataAccess.Test.Specs_for_Core.When_using_base.Can_create_mapper()' to access security critical method 'Microsoft.VisualStudio.TestTools.UnitTesting.Assert.IsNotNull(System.Object)' failed. Assembly 'SPPD.Backend.DataAccess.Test, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' is marked with the AllowPartiallyTrustedCallersAttribute, and uses the level 2 security transparency model. Level 2 transparency causes all methods in AllowPartiallyTrustedCallers assemblies to become security transparent by default, which may be the cause of this exception.

    The test I am running is really straightforward and looks something like the following:

        [TestMethod]
        public void Can_create_mapper()
        {
            this.SetupTest();

            var mockMapper = new Moq.Mock<IMapper>().Object;
            this._Resolver.Setup(x => x.Resolve<IMapper>()).Returns(mockMapper).Verifiable();

            var testBaseDa = new TestBaseDa();
            var result = testBaseDa.TestCreateMapper<IMapper>();

            Assert.IsNotNull(result);   // <<< THROWS EXCEPTION HERE
            Assert.AreSame(mockMapper, result);

            this._Resolver.Verify();
        }

    I have no idea what this means, and I have been looking around and have found very little on the topic. The closest reference I have found is http://dotnetzip.codeplex.com/Thread/View.aspx?ThreadId=80274, but it's not very clear on what they did to fix it... Anyone got any ideas?

    Read the article

  • How to generate a monochrome bit mask for a 32-bit bitmap

    - by Mordachai
    Under Win32, it is a common technique to generate a monochrome bitmask from a bitmap for transparency use by doing the following:

        SetBkColor(hdcSource, clrTransparency);
        VERIFY(BitBlt(hdcMask, 0, 0, bm.bmWidth, bm.bmHeight, hdcSource, 0, 0, SRCCOPY));

    This assumes that hdcSource is a memory DC holding the source image, and hdcMask is a memory DC holding a monochrome bitmap of the same size (so both are 32x32, but the source is 4-bit color, while the target is 1-bit monochrome). However, this seems to fail for me when the source is 32-bit color + alpha. Instead of getting a monochrome bitmap in hdcMask, I get a mask that is all black: no bits get set to white (1). Whereas this works for the 4-bit color source. My search-foo is failing, as I cannot seem to find any references to this particular problem. I have isolated that this is indeed the issue in my code: i.e. if I use a source bitmap that is 16-color (4-bit), it works; if I use a 32-bit image, it produces the all-black mask. Is there an alternate method I should be using in the case of 32-bit color images? Is there an issue with the alpha channel that overrides the normal behavior of the above technique? Thanks for any help you may have to offer!

    ADDENDUM: I am still unable to find a technique that creates a valid monochrome bitmap for my GDI+-produced source bitmap. I have somewhat alleviated my particular issue by simply not generating a monochrome bitmask at all; instead I'm using TransparentBlt(), which seems to get it right (but I don't know what they're doing internally that's any different that allows them to correctly mask the image). It might be useful to have a really good, working function:

        HBITMAP CreateTransparencyMask(HDC hdc, HBITMAP hSource, COLORREF crTransparency);

    Where it always creates a valid transparency mask, regardless of the color depth of hSource. Ideas?
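    A hedged sketch of the requested helper (mine, not from the post): comparing pixels by hand sidesteps the SetBkColor/BitBlt path entirely, and masking off the high byte of each COLORREF ignores the alpha channel that appears to be defeating the blit. GetPixel/SetPixel is slow, so a production version would walk the raw bits via GetDIBits instead, but it keeps the sketch short:

        #include <windows.h>

        HBITMAP CreateTransparencyMask(HDC hdc, HBITMAP hSource, COLORREF crTransparency)
        {
            BITMAP bm;
            GetObject(hSource, sizeof(bm), &bm);

            // 1bpp monochrome bitmap of the same dimensions as the source.
            HBITMAP hMask = CreateBitmap(bm.bmWidth, bm.bmHeight, 1, 1, NULL);

            HDC hdcSrc  = CreateCompatibleDC(hdc);
            HDC hdcMask = CreateCompatibleDC(hdc);
            HGDIOBJ oldSrc  = SelectObject(hdcSrc, hSource);
            HGDIOBJ oldMask = SelectObject(hdcMask, hMask);

            for (int y = 0; y < bm.bmHeight; ++y)
                for (int x = 0; x < bm.bmWidth; ++x)
                {
                    // Compare RGB only; the high (alpha) byte is masked off.
                    BOOL transparent =
                        (GetPixel(hdcSrc, x, y) & 0x00FFFFFF) == (crTransparency & 0x00FFFFFF);
                    // White (1) where transparent, black (0) where opaque.
                    SetPixel(hdcMask, x, y, transparent ? RGB(255, 255, 255) : RGB(0, 0, 0));
                }

            SelectObject(hdcSrc, oldSrc);
            SelectObject(hdcMask, oldMask);
            DeleteDC(hdcSrc);
            DeleteDC(hdcMask);
            return hMask;
        }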

    Read the article

  • jquery image fading with tabs

    - by StealthRT
    Hey all, I am trying my best to figure out how to go about doing this: I have 2 tabs. When the page loads, tab1 is selected automatically. This shows the tab at 1.0 opacity while tab2 stays at 0.7. Once the user clicks on tab2, tab1 goes to 0.7 opacity and tab2 goes to 1.0. However, I cannot seem to get it to do that! Here is my code:

        function checkTab(theTab) {
            $('#tab1').fadeTo(250, 0.70);
            $('#tab2').fadeTo(250, 0.70);

            if ($("#tabActive").val() == theTab) {
                $(theTab).fadeTo(250, 1);
            }
        }

        $(document).ready(function() {
            $('#tab1').hover(function() { $(this).fadeTo(250, 1) }, function() { checkTab('#tab1') });
            $('#tab2').hover(function() { $(this).fadeTo(250, 1) }, function() { checkTab('#tab2') });
            $('#tab2').fadeTo(250, 0.70);
            $('#tabActive').val('tab1');
        });
        </script>

        <li class="stats"><img src="images/Stats.png" name="nav1" width="70" height="52" id="tab1" onclick="$('#tabActive').val('tab1');" /></li>
        <li class="cal"><img src="images/cal.png" name="nav1" width="70" height="52" id="tab2" onclick="$('#tabActive').val('tab2');" /></li>

        <input name="tabActive" id="tabActive" type="text" />

    Any help would be great! :) David
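    A hedged diagnosis and sketch (not from the original post): checkTab compares the stored value "tab1" against the argument "#tab1", so the equality test never passes and the active tab fades down with the rest. Prefixing the stored value with '#' (or stripping it from the argument) lines the two up:

        function checkTab(theTab) {
            $('#tab1, #tab2').fadeTo(250, 0.70);

            // The input holds "tab1"/"tab2"; theTab arrives as "#tab1"/"#tab2".
            if ('#' + $('#tabActive').val() == theTab) {
                $(theTab).fadeTo(250, 1);
            }
        }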

    Read the article

  • jquery image fading with tabs

    - by StealthRT
    Hey all, I am trying my best to figure out how to go about doing this: I have 2 tabs. When the page loads, tab1 is selected automatically. This shows the tab at 1.0 opacity while tab2 stays at 0.7. Once the user clicks on tab2, tab1 goes to 0.7 opacity and tab2 goes to 1.0. However, I cannot seem to get it to do that! Here is my code:

        <script type="text/javascript">
        function checkTab(theTab) {
            $('#tab1').fadeTo(250, 0.70);
            $('#tab2').fadeTo(250, 0.70);

            if ($("#tabActive").val() == theTab) {
                $(theTab).fadeTo(250, 1);
            }
        }

        $(document).ready(function() {
            $('#tab1').hover(function() { $(this).fadeTo(250, 1) }, function() { checkTab('#tab1') });
            $('#tab2').hover(function() { $(this).fadeTo(250, 1) }, function() { checkTab('#tab2') });
            $('#tab2').fadeTo(250, 0.70);
            $('#tabActive').val('tab1');
        });
        </script>

        <li class="stats"><img src="images/Stats.png" name="nav1" width="70" height="52" id="tab1" onclick="$('#tabActive').val('tab1');" /></li>
        <li class="cal"><img src="images/cal.png" name="nav1" width="70" height="52" id="tab2" onclick="$('#tabActive').val('tab2');" /></li>

        <input name="tabActive" id="tabActive" type="text" />

    Any help would be great! :) David

    Read the article

  • How can I make an event created through Google Calendar's API send an invitation email?

    - by Cebjyre
    I'm trying to create an event through the API and it is mostly working, with the exception that while the new events are being created in the invitees' calendars, no emails are being sent. Creating the event from the web interface is pushing the event through, as well as sending the email (except one account that doesn't get any notifications at all, but that's not relevant to my current problem). The event I am trying to push in is:

        <entry xmlns='http://www.w3.org/2005/Atom' xmlns:gd='http://schemas.google.com/g/2005'>
          <category scheme='http://schemas.google.com/g/2005#kind' term='http://schemas.google.com/g/2005#event'></category>
          <title type='text'>test event</title>
          <content type='text'>content.</content>
          <gd:transparency value='http://schemas.google.com/g/2005#event.opaque'></gd:transparency>
          <gd:eventStatus value='http://schemas.google.com/g/2005#event.confirmed'></gd:eventStatus>
          <gd:where valueString='somewhere'></gd:where>
          <gd:who email="[redacted]" rel='http://schemas.google.com/g/2005#event.attendee' valueString='Me'>
            <gd:attendeeStatus value='http://schemas.google.com/g/2005#event.invited'/>
          </gd:who>
          <gd:who email="[redacted again]" rel='http://schemas.google.com/g/2005#event.organizer' valueString='Also Me'>
            <gd:attendeeStatus value='http://schemas.google.com/g/2005#event.accepted'/>
          </gd:who>
          <gd:when startTime='2010-05-18T15:30:00.000+10:00' endTime='2010-05-18T16:00:00.000+10:00'></gd:when>
        </entry>

    And when I request event lists I can't see any large difference between events created through the API and through the web interface.
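    A hedged pointer (not from the original post): in the GData Calendar API of that era, invitation emails were opt-in per request, and the commonly cited fix was to append the sendEventNotifications query parameter to the feed URL the entry is POSTed to, along the lines of:

        POST https://www.google.com/calendar/feeds/default/private/full?sendEventNotifications=true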

    Read the article

  • Issue with transparent texture on 3D primitive, XNA 4.0

    - by Bevin
    I need to draw a large set of cubes, all with (possibly) unique textures on each side. Some of the textures also have parts of transparency. The cubes that are behind ones with transparent textures should show through the transparent texture. However, it seems that the order in which I draw the cubes decides if the transparency works or not, which is something I want to avoid. Look here:

        cubeEffect.CurrentTechnique = cubeEffect.Techniques["Textured"];

        Block[] cubes = new Block[4];
        cubes[0] = new Block(BlockType.leaves, new Vector3(0, 0, 3));
        cubes[1] = new Block(BlockType.dirt, new Vector3(0, 1, 3));
        cubes[2] = new Block(BlockType.log, new Vector3(0, 0, 4));
        cubes[3] = new Block(BlockType.gold, new Vector3(0, 1, 4));

        foreach (Block b in cubes)
        {
            b.shape.RenderShape(GraphicsDevice, cubeEffect);
        }

    This is the code in the Draw method. It produces this result: [screenshot] As you can see, the textures behind the leaf cube are not visible on the other side. When I reverse index 3 and 0 in the array, I get this: [screenshot] It is clear that the order of drawing is affecting the cubes. I suspect it may have to do with the blend mode, but I have no idea where to start with that.
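    A hedged sketch of the usual two-part cure (my code, not the poster's; cameraPosition and Block.Position are assumed names, and System.Linq is assumed to be imported): render cut-out style textures with alpha testing so draw order stops mattering for them, and depth-sort whatever is genuinely translucent:

        GraphicsDevice.BlendState = BlendState.AlphaBlend;
        GraphicsDevice.DepthStencilState = DepthStencilState.Default;

        // XNA 4.0 dropped fixed-function alpha test; AlphaTestEffect discards
        // texels below ReferenceAlpha, so fully transparent parts of the leaf
        // texture never write depth. It (or an equivalent clip() in cubeEffect)
        // would be used in place of the "Textured" technique for cut-out cubes.
        var alphaTest = new AlphaTestEffect(GraphicsDevice)
        {
            AlphaFunction = CompareFunction.Greater,
            ReferenceAlpha = 128
        };

        // Any genuinely translucent cubes still need back-to-front drawing.
        foreach (Block b in cubes.OrderByDescending(
                     c => Vector3.DistanceSquared(cameraPosition, c.Position)))
        {
            b.shape.RenderShape(GraphicsDevice, cubeEffect);
        }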

    Read the article

  • Why can't IE6 show semi-transparent PNG8 files with the alpha filter?

    - by vvo
    -- read the whole question before answering --

    Hi, I work on a big website that had a lot (45000+) of PNG24 images (with semi-transparency). I converted them to PNG8 and it works very well (a big help on page load time...). The thing is, I had to keep PNG24 files for IE6 users (with the alpha filter to get semi-transparent pixels), because we all know that we can't use PNG8 semi-transparent images in IE6: the semi-transparent pixels are rendered either fully opaque or completely transparent. I tried to use the AlphaImageLoader filter with PNG8 images, but it just doesn't work; the pixels are still opaque/completely transparent, with no semi-transparency. What's the reason it's not working? Is there a difference for IE when dealing with semi-transparent pixels from a PNG24 versus from a PNG8? I couldn't find any information on the MSDN websites or on Stack Overflow... This is crazy...!

    DISCLAIMER: I'm not searching for a f**kin' IE6 PNG fix or sh*t like that; I already know the AlphaImageLoader and HTC techniques, etc. These all work well with PNG24 files but don't work with PNG8 files.

    Read the article

  • Disabled input text color

    - by Incidently
    Hi, the simple HTML below displays differently in Firefox and WebKit-based browsers (I checked in Safari, Chrome and iPhone). In Firefox both border and text have the same color (#880000), but in Safari the text gets a bit lighter (as if it had some transparency applied to it). Can I somehow fix this (remove this transparency in Safari)?

    UPDATE: Thank you for your answers. I don't need this for my work anymore (instead of disabling, I'm replacing input elements with styled div elements), but I'm still curious why this happens and whether there is any way to control this behaviour...

        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
          <title></title>
          <style type="text/css">
            input:disabled {
              border: solid 1px #880000;
              background-color: #ffffff;
              color: #880000;
            }
          </style>
        </head>
        <body>
          <form action="">
            <input type="text" value="disabled input box" disabled="disabled"/>
          </form>
        </body>
        </html>
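    A hedged sketch of a possible workaround (not from the thread): WebKit dims disabled text through its own text fill rather than the color property, so the proprietary -webkit-text-fill-color property usually brings the full color back:

        input:disabled {
            border: solid 1px #880000;
            background-color: #ffffff;
            color: #880000;
            -webkit-text-fill-color: #880000;  /* WebKit-specific override of the dimming */
        }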

    Read the article

  • Image thumbnails flashes and then disappears in Windows Vista

    - by Hemant
    I have Windows Vista installed on my machine and I am facing a really annoying problem: whenever I open a folder containing images, the image thumbnails appear for an instant and are then replaced by the standard file icon. Argh...

    Points to note:

    • The "Always show icons, never thumbnails" setting is unchecked
    • I am running the Windows Aero theme (with transparency support)
    • I have 4 GB of RAM on my machine, so memory is not a problem

    Can you please suggest a solution?

    Read the article

  • Windows 7 Aero Theme when on Battery

    - by Dan Revell
    Windows 7 has the full and basic Aero themes that can be changed in Personalization. Windows used to automatically disable Aero transparency when I unplugged my laptop from the mains. I reset the theme, and now it doesn't do this anymore, and I can't seem to find how to enable it again. I've checked the advanced power options and Control Panel, but turning this feature back on is eluding me. Any suggestions would be much appreciated.

    Read the article

  • Find out what program is using mirror drivers

    - by Frantumn
    I can't get access to the Aero transparency themes in Windows 7. The troubleshooter says that I need to close programs that are using mirror drivers in order for transparency to work. I've looked through my Task Manager list of running applications and I can't see any that jump out as the culprit. Is there a program that will tell me which app is causing this issue? The MS troubleshooter doesn't give me the details.

    Read the article

  • How can I obtain a HBITMAP or HICON from a Direct2D bitmap?

    - by Tom
    Is there any way to obtain an HBITMAP or HICON from an ID2D1Bitmap* using Direct2D? I am using this function to load the bitmap. The reason I ask is that I am creating my level editor tool and would like to draw a PNG image on a standard button control. I know that you can do this using GDI+:

        HBITMAP hBitmap;
        Gdiplus::Bitmap b(L"a.png");
        // GetHBITMAP takes a background Color (used where the source has alpha), not NULL.
        b.GetHBITMAP(Gdiplus::Color(255, 255, 255), &hBitmap);
        SendMessage(GetDlgItem(hDlg, IDC_BUTTON1), BM_SETIMAGE, IMAGE_BITMAP, (LPARAM)hBitmap);

    Is there any equivalent, simple solution using Direct2D? If possible, I would like to render multiple PNG files (some with transparency) on a single button.

    Read the article

  • The best terminal emulator for a heavy terminal user?

    - by Noah Goodrich
    I spend a lot of time at the command line during the workday, and at home too, since I run Ubuntu exclusively. I've been using the default GNOME Terminal, but I've reached a point where I'd really like to get my terminal tricked out so that my common tasks are as easy as possible. Specifically, I find that I spend a lot of time browsing code in the terminal and working in config files. On my wish list would be:

    • The ability to have multiple screens, tabs, or windows (I don't have a preference at this point) that I can easily switch between
    • Color coding for everything
    • Easy ways to modify the aesthetics of the terminal (is it vain to want my terminal to look nice?), such as transparency, borders, etc.

    Read the article

  • What’s Your Tax Strategy? Automate the Tax Transfer Pricing Process!

    - by tobyehatch
    Does your business operate in multiple countries? Well, whether you like it or not, many local and international tax authorities inspect your tax strategy. Legal, effective tax planning is perceived as a "moral" issue, and CEOs are being asked to testify on their process of tax transfer pricing between multinational legal entities. Marc Seewald, Senior Director of Product Management for EPM Applications specializing in all tax subjects and Product Manager for Oracle Hyperion Tax Provisioning, and Bart Stoehr, Senior Director of Product Strategy for Oracle Hyperion Profitability and Cost Management, joined me for a discussion/podcast on this interesting subject.

    So what exactly is "tax transfer pricing"? Marc defined it this way: "Tax transfer pricing is a profit allocation methodology required to be used by multinational corporations. Specifically, the ultimate goal of the transfer pricing is to ensure that the global multinational pays their fair share of income tax in each of their local markets. Specifically, it prevents companies from unfairly moving profit from 'high tax' countries to 'low tax' countries."

    According to Marc, in today's global economy, profitability can be significantly impacted by goods and services exchanged between the related divisions within a single multinational company. To ensure that these cost allocations are done fairly, there are rules that govern the process. These rules ensure that intercompany allocations fairly represent the actual nature of the business's activity (as if the two divisions were unrelated) and provide a clear audit trail of how the costs have been allocated, to prove that allocations fall within reasonable ranges.

    What are the repercussions of improper tax transfer pricing? How important is it? Tax transfer pricing allocations can materially impact the amount of overall corporate income taxes paid by a company worldwide, in some cases by hundreds of millions of dollars! Since so much tax revenue is at stake, revenue agencies like the IRS, and international regulatory bodies like the Organization for Economic Cooperation and Development (OECD), are pushing to reform and clarify reporting for tax transfer pricing. Most recently the OECD announced an "Action Plan for Base Erosion and Profit Shifting."

    As Marc explained, the times are changing and companies need to be responsive to this issue. "It feels like every other week there is another company being accused of avoiding taxes," said Marc. Most recently, Caterpillar was accused of avoiding billions of dollars in taxes. In the last couple of years, Apple, GE, Ikea, and Starbucks have all been accused of tax avoidance. It's imperative that companies like these have a clear and auditable tax transfer process that enables them to justify tax transfer pricing allocations and avoid steep penalties and bad publicity.

    Transparency and efficiency are what is needed when it comes to the tax transfer pricing process. Bart explained that tax transfer pricing is driving a deeper inspection of profit recognition, specifically focused on the tax element of profit. However, the allocations needed to support tax profitability are nearly identical in process to allocations taking place in other parts of the finance organization. For example, the methods and processes necessary to arrive at tax profitability by legal entity are no different than those used to arrive at fully loaded profitability for a product line. In fact, there is a great opportunity for alignment across these two different functions.

    So it seems that tax transfer pricing should be reflected in profitability in general. Bart agreed and told us more about some of the critical sub-processes of an overall tax transfer pricing process within the Oracle solution for tax transfer pricing. "First, there is a ton of data preparation, enrichment and pre-allocation data analysis that is managed in the Oracle Hyperion solution. This serves as the 'data staging' for the next, critical sub-processes. From here, we leverage the Oracle EPM platform's ability to re-use dimensions and legal entity driver data and financial data with Oracle Hyperion Profitability and Cost Management (HPCM). Within HPCM, we manage the driver data, define the legal entity to legal entity allocation rules (like cost plus), and have the option to test out multiple, simultaneous tax transfer pricing what-if scenarios. Once processed, a tax expert can evaluate the effectiveness of any one scenario result versus another via a variance analysis configured with HPCM's pre-packaged reporting capability known as Oracle Hyperion SmartView for Office."

    Further, Bart explained that the ability to visibly demonstrate how a cost or revenue has been allocated is really helpful and auditable. "HPCM's Traceability Maps are that visual representation of all allocation flows that have been executed, and they are the tax transfer analyst's best friend in maintaining clear documentation for tax transfer pricing audits. Simply click and drill as you inspect the chain of allocation definitions and results. Once final, the post-allocated tax data can be compared to the GL to create invoices and journal entries for posting to your GL system of choice. Of course, there is a framework for overall governance of the journal entries, allocation percentages, and reporting, including necessary approvals."

    Lastly, Marc explained that the key value in using the Oracle Hyperion solution for tax transfer pricing is that it keeps everything in alignment in one single place. Specifically, Oracle Hyperion effectively becomes the single book of record for the GAAP, management, and tax sets of books. There are many benefits to having one source of the truth. These include EFFICIENCY, CONTROLS and TRANSPARENCY.

    So, what's your tax strategy? Why not automate the tax transfer pricing process! To listen to the entire podcast, click here. To learn more about Oracle Hyperion Profitability and Cost Management (HPCM), click here.

    Read the article

  • JavaOne pictures and Community Commentary on JCP Awards

    - by heathervc
    We posted some pictures from JCP-related events at JavaOne 2012 on the JCP Facebook page today. The 2012 JCP Program Award winners and some of the nominees responded to the community recognition of their achievements during some of the JCP events last week.

    "Our job on the EC is to balance the need of innovation – so we don't standardize too early, or too late. We try to find that sweet spot that makes innovation and standardization work together, and not against each other."
    - Ben Evans, CEO of jClarity and Executive Committee (EC) representative of the London Java Community, 2012 JCP Member/Participant of the Year Winner

    "SouJava has been evangelizing the Java platform, promoting the Java ecosystem in Brazil, and contributing to JSRs for several years. It's very gratifying to have our work recognized, on behalf of many developers and Java User Groups around the world. This really is the work of a large group of people, represented by the few that can be here tonight."
    - Michael Santos, representative of SouJava, 2012 JCP Member/Participant of the Year Winner

    "In the last years Credit Suisse has contributed to the development of Java EE specifications through participation in many customer advisory boards, through statements of requirements for extensions to the core Java related products in use, and active participation in JSRs. Winning the JCP Outstanding Spec Lead Award 2012 is very encouraging for our engagement and also demonstrates the level of expertise and commitment to drive the evolution of Java. Victor Grazi is happy and honored to receive this award."
    - Susanne Cech Previtali, Executive Committee (EC) representative of Credit Suisse, accepting the award for the 2012 JCP Outstanding Spec Lead Winner

    "Managing a JSR is difficult. There are so many decisions to be made and so many good and varied opinions, you never really know if you have decided correctly. The key to success is transparency and collaboration. I am truly humbled by receiving this award; there are so many other active JSRs." Victor added that going forward in the JCP EC, they would like to simplify and open the process of participation (being addressed in the JCP.Next initiative of the JCP EC). "We would also like to encourage the engagement of universities, professors and students, as an important part of the Java community. While innovation is the lifeblood of our community and industry, without strong standards and compatibility requirements, we all end up in a maze of technology where everything is slightly different and doesn't quite work with everything else."
    - Victor Grazi, Executive Committee (EC) representative of Credit Suisse, 2012 JCP Outstanding Spec Lead Winner

    "I am very pleased, of course, to accept this award, but the credit really should go to all of those who have participated in the work of the JCP, while pushing for changes in the way it operates. JCP.Next represents three JSRs. The first two are done, but the final step, JSR 358, is the complicated one, and it will bring in the lawyers. Just to give you an idea of what we're dealing with, it affects licensing, intellectual property, patents, implementations not based on the Reference Implementation (RI), the role of the RI, compatibility policy, possible changes to the Technical Compatibility Kit (TCK), transparency, where individuals fit in, open source, and more."
    - Patrick Curran, JCP Chair, Spec Lead on the JCP.Next JSRs (JSR 348, JSR 355 and JSR 358), 2012 JCP Most Significant JSR Winner

    "I'm especially glad to see the JCP community recognize JCP.Next for its importance. The governance work it represents is KEY to moving the Java platform forward and the success of the technology."
    - John Rizzo, Executive Committee (EC) representative of Aplix Corporation, JSR Expert Group Member

    "I am deeply honored to be nominated. I had the privilege to receive two awards on behalf of Expert Groups and Spec Leads two years ago. But this time, I am nominated personally, which values my own contribution to the JCP, and of course, participation in JSRs and the EC work. I'm a fan of Agile principles and values. Being an Agile Coach and Consultant, I use them for some of the biggest EC Member companies and projects. It fuels my ability to help the JCP become more agile, lean and transparent as part of the JCP.Next effort."
    - Werner Keil, Individual Executive Committee (EC) Member, a 2012 JCP Member/Participant of the Year Nominee, JSR Expert Group Member

    "The JCP has always been some kind of institution for me," Markus said. "If in technical doubt, I go there, look for the specifications of the implementation I work with at the moment and verify what I had observed. Since the beginning of my Java journey more than 12 years back now, I always had a strong relationship with the JCP. Shaping the future of a technology by joining the JCP – giving feedback and contributing to the road ahead through individual JSRs – that brings you to a whole new level." Calling himself "the new kid on the block," he explained that for years he was afraid to join the JCP and contribute. But in reality, "Every single one of the big names I meet from the different Expert Groups is a nice person. People you can actually work with," he says. "And nobody blames you for things you don't know. As long as you are committed and bring what is worth the most: passion, experiences and the desire to make a difference."
    - Markus Eisele, a 2012 JCP Member of the Year Nominee, JSR Expert Group Member

    Congratulations again to all of the nominees and winners of the JCP Program Awards. Next year, we will add another award for the group of JUG members (not an entire JUG) that makes the best contribution to the Adopt-a-JSR program. Let us know if you have other suggestions or improvements.

    Read the article

  • .NET Security Part 3

    - by Simon Cooper
    You write a security-related application that allows addins to be used. These addins (as dlls) can be downloaded from anywhere and, if allowed to run full-trust, could open a security hole in your application. So you want to restrict what the addin dlls can do, using a sandboxed appdomain, as explained in my previous posts. But there needs to be an interaction between the code running in the sandbox and the code that created the sandbox, so the sandboxed code can control or react to things that happen in the controlling application. Sandboxed code needs to be able to call code outside the sandbox.

    Now, there are various methods of allowing cross-appdomain calls, the two main ones being .NET Remoting with MarshalByRefObject, and WCF named pipes. I'm not going to cover the details of setting up such mechanisms here, or which you should choose for your specific situation; there are plenty of blogs and tutorials covering such issues elsewhere. What I'm going to concentrate on here is the more general problem of running fully-trusted code within a sandbox, which is required in most methods of app-domain communication and control.

    Defining assemblies as fully-trusted

    In my last post, I mentioned that when you create a sandboxed appdomain, you can pass in a list of assembly strongnames that run as full-trust within the appdomain:

        // get the Assembly object for the assembly
        Assembly assemblyWithApi = ...

        // get the StrongName from the assembly's collection of evidence
        StrongName apiStrongName = assemblyWithApi.Evidence.GetHostEvidence<StrongName>();

        // create the sandbox
        AppDomain sandbox = AppDomain.CreateDomain(
            "Sandbox", null, appDomainSetup, restrictedPerms, apiStrongName);

    Any assembly that is loaded into the sandbox with a strong name the same as one in the list of full-trust strong names is unconditionally given full-trust permissions within the sandbox, regardless of permissions and sandbox setup. This is very powerful! You should only use this for assemblies that you trust as much as the code creating the sandbox.

    So now you have a class that you want the sandboxed code to call:

        // within assemblyWithApi
        public class MyApi
        {
            public static void MethodToDoThings() { ... }
        }

        // within the sandboxed dll
        public class UntrustedSandboxedClass
        {
            public void DodgyMethod()
            {
                ...
                MyApi.MethodToDoThings();
                ...
            }
        }

    However, if you try to do this, you get quite an ugly exception:

        MethodAccessException: Attempt by security transparent method 'UntrustedSandboxedClass.DodgyMethod()' to access security critical method 'MyApi.MethodToDoThings()' failed.

    Security transparency, which I covered in my first post in the series, has entered the picture. Partially-trusted code runs at the Transparent security level, fully-trusted code runs at the Critical security level, and Transparent code cannot under any circumstances call Critical code.

    Security transparency and AllowPartiallyTrustedCallersAttribute

    So the solution is easy, right? Make MethodToDoThings SafeCritical; then the transparent code running in the sandbox can call the api:

        [SecuritySafeCritical]
        public static void MethodToDoThings() { ... }

    However, this doesn't solve the problem. When you try again, exactly the same exception is thrown; MethodToDoThings is still running as Critical code. What's going on?

    By default, a fully-trusted assembly always runs Critical code, regardless of any security attributes on its types and methods. This is because it may not have been designed in a secure way when called from transparent code; as we'll see in the next post, it is easy to open a security hole despite all the security protections .NET 4 offers. When exposing an assembly to be called from partially-trusted code, the entire assembly needs a security audit to decide what should be transparent, safe critical, or critical, and to close any potential security holes.

    This is where AllowPartiallyTrustedCallersAttribute (APTCA) comes in. Without this attribute, fully-trusted assemblies run Critical code, and partially-trusted assemblies run Transparent code. When this attribute is applied to an assembly, it confirms that the assembly has had a full security audit, and it is safe to be called from untrusted code. All code in that assembly runs as Transparent, but SecurityCriticalAttribute and SecuritySafeCriticalAttribute can be applied to individual types and methods to make those run at the Critical or SafeCritical levels, with all the restrictions that entails.

    So, to allow the sandboxed assembly to call the full-trust API assembly, simply add APTCA to the API assembly:

        [assembly: AllowPartiallyTrustedCallers]

    and everything works as you expect. The sandboxed dll can call your API dll, and from there communicate with the rest of the application.

    Conclusion

    That's the basics of running a full-trust assembly in a sandboxed appdomain, and allowing a sandboxed assembly to access it. The key is AllowPartiallyTrustedCallersAttribute, which is what lets partially-trusted code call a fully-trusted assembly. However, an assembly with APTCA applied to it means that you have run a full security audit of every type and member in the assembly. If you don't, then you could inadvertently open a security hole. I'll be looking at ways this can happen in my next post.

    Read the article
