Search Results

Search found 779 results on 32 pages for 'uiimage'.

Page 26 of 32

  • iPhone - UIView Animation not looping correctly

    - by Robert
    Hey all, got a little problem here and can't figure out what I am doing wrong. I am trying to animate a UIView up and down repeatedly. When I start it, it goes down correctly and then back up, but then immediately shoots to the "final" position of the animation. Code is as follows:

        UIImageView *guide = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image.png"]];
        guide.frame = CGRectMake(250, 80, 30, 30);
        [self.view addSubview:guide];

        [UIView beginAnimations:nil context:nil];
        [UIView setAnimationCurve:UIViewAnimationCurveEaseInOut];
        [UIView setAnimationDuration:2];
        [UIView setAnimationRepeatAutoreverses:YES];
        [UIView setAnimationBeginsFromCurrentState:YES];
        guide.frame = CGRectMake(250, 300, 30, 30);
        [UIView setAnimationRepeatCount:10];
        [UIView commitAnimations];

    Thanks in advance!
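
    One hedged way around the jump is to animate the layer directly with Core Animation, so the view's model value is never changed and there is no "final" frame to snap to when the animation ends. A minimal sketch; the 220-point offset is derived from the two frames above, and <QuartzCore/QuartzCore.h> must be imported:

        CABasicAnimation *bounce = [CABasicAnimation animationWithKeyPath:@"position.y"];
        bounce.fromValue = [NSNumber numberWithFloat:guide.layer.position.y];
        bounce.toValue = [NSNumber numberWithFloat:guide.layer.position.y + 220.0f];
        bounce.duration = 2.0;
        bounce.autoreverses = YES;   // go down, then back up
        bounce.repeatCount = 10;
        bounce.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
        [guide.layer addAnimation:bounce forKey:@"bounce"];   // the view's model frame is never touched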

    Read the article

  • Save UIView's representation to file.

    - by fish potato
    What is the easiest way to save a UIView's representation to a file? My solution is:

        UIGraphicsBeginImageContext(someView.frame.size);
        [someView drawRect:someView.frame];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        NSString *pathToCreate = @"sample.png";
        NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
        [imageData writeToFile:pathToCreate atomically:YES];

    but it seems tricky, and I think there must be a more efficient way to do this.
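
    A hedged sketch of the more usual approach: calling -drawRect: directly is not supported, so the view's layer is rendered into the image context instead, and the PNG is written to a full path inside the app's Documents directory (the relative path @"sample.png" would not point anywhere writable on the device):

        #import <QuartzCore/QuartzCore.h>   // for -renderInContext:

        UIGraphicsBeginImageContext(someView.bounds.size);
        [someView.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString *pathToCreate = [documentsDir stringByAppendingPathComponent:@"sample.png"];
        [UIImagePNGRepresentation(image) writeToFile:pathToCreate atomically:YES];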

    Read the article

  • Setting an ASIHTTPRequest post to a SimpleHTTPServer Python server?

    - by Rob
    I am working on a project (that I will not be releasing to the App Store - just for fun) that will upload an image via an HTTP POST request from my iPhone to a server running Python's SimpleHTTPServer. I have successfully used the ASIHTTP APIs in the past for text strings, but can't for the life of me figure out how to upload an image. I have tried all of the following:

        NSString *path = [[NSBundle mainBundle] pathForResource:@"Image" ofType:@".png"];

        [request setFile:@"Image.png" forKey:@"file"];
        [request setFile:path forKey:@"file"];
        [request setFile:path withFileName:@"Image.png" andContentType:@"image/jpeg" forKey:@"file"];
        [request setData:[UIImage imageNamed:@"Image.png"] withFileName:@"Image.png" andContentType:@"Image" forKey:@"file"];

    Any thoughts on where I could be going wrong?
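
    A hedged sketch of one likely fix: the setFile:/setData: family lives on ASIFormDataRequest (the multipart subclass of ASIHTTPRequest), and setData: expects NSData rather than a UIImage, so the image needs converting first. Note also that Python's stock SimpleHTTPServer only implements GET and HEAD, so the server side needs a do_POST handler for the upload to land anywhere. The URL below is a made-up example:

        NSURL *url = [NSURL URLWithString:@"http://192.168.1.10:8000/upload"];   // hypothetical server address
        ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:url];
        NSData *imageData = UIImagePNGRepresentation([UIImage imageNamed:@"Image.png"]);   // NSData, not UIImage
        [request setData:imageData withFileName:@"Image.png" andContentType:@"image/png" forKey:@"file"];
        [request startSynchronous];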

    Read the article

  • Need example of how to create/manipulate image pixel data with iPhone SDK

    - by whiskeyspider
    Looking for a simple example or a link to a tutorial. Say I have a bunch of values stored in an array. I would like to create an image and update the image data from my array. Assume the array values are intensity data that will be updating a grayscale image, and that the values are between 0 and 255 -- or that I will convert them to that range. This is not for purposes of animation; rather, the image would be updated based on user interaction. This is something I know how to do well in Java, but I am very new to iPhone programming. I've googled some information about CGImage and UIImage -- but am confused as to where to start. Any help would be appreciated.
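
    A minimal sketch of one way to do this with Core Graphics, assuming `pixels` points to width * height bytes of 0-255 intensity values (all names and dimensions here are illustrative, not from the question):

        size_t width = 256, height = 256;                 // hypothetical dimensions
        unsigned char *pixels = malloc(width * height);
        // ... copy your intensity array into pixels[y * width + x] ...

        CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
        CGContextRef ctx = CGBitmapContextCreate(pixels, width, height,
                                                 8,        // bits per component
                                                 width,    // bytes per row
                                                 gray, kCGImageAlphaNone);
        CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
        UIImage *image = [UIImage imageWithCGImage:cgImage];   // assign to a UIImageView to display

        CGImageRelease(cgImage);
        CGContextRelease(ctx);
        CGColorSpaceRelease(gray);
        // keep `pixels` allocated while you continue to redraw into it; free it when done

    To update after user interaction, rewrite the buffer and rebuild the CGImage/UIImage from the same bitmap context.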

    Read the article

  • iPhone: Add background button to view when UITableView has no cells

    - by Nic Hubbard
    I have a UITableViewController. When there is no data to populate the UITableView, I want to add a button which uses an image. So, rather than the user seeing a table view with no records, they will see an image that says "No records have been added, tap to add one"; then they tap it and we create a new one. I assumed I would just hide the UITableView, then create the button, but I never see the button. Here is what I am using:

        if ([[fetchedResultsController sections] count] == 0) {
            self.tableView.hidden = YES;
            // Create button w/ image
            UIButton *btn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
            btn.frame = CGRectMake(0, 0, 100, 50);
            [btn setImage:[UIImage imageNamed:@"no-rides.png"] forState:UIControlStateNormal];
            [self.view addSubview:btn];
        }

    Ideas on why I would never see the button? When I show this view, it seems to have a transparent background for a second, then changes to white...
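
    A hedged sketch of one explanation and workaround: in a UITableViewController, self.view and self.tableView are the same object, so hiding the table also hides anything added to self.view. One option (iOS 3.2 and later) is to leave the table visible and hand the button to its backgroundView instead; the target/action selector below is a hypothetical placeholder:

        if ([[fetchedResultsController sections] count] == 0) {
            UIButton *btn = [UIButton buttonWithType:UIButtonTypeCustom];
            [btn setImage:[UIImage imageNamed:@"no-rides.png"] forState:UIControlStateNormal];
            [btn addTarget:self action:@selector(addRecordTapped:) forControlEvents:UIControlEventTouchUpInside];
            self.tableView.backgroundView = btn;                               // sized to fill the table automatically
            self.tableView.separatorStyle = UITableViewCellSeparatorStyleNone; // hide the empty rows' separators
        }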

    Read the article

  • Using NSThread to solve waiting for image from URL on the iPhone

    - by james.ingham
    So I have the following code in a method, with which I want to set a UIImageView's image to one from an online source:

        [NSThread detachNewThreadSelector:@selector(loadImage) toTarget:self withObject:nil];

    Then in the method called by the thread I have this:

        - (void) loadImage {
            NSURL *url = [NSURL URLWithString:logoPath]; // logoPath is an NSString with path details
            NSData *data = [NSData dataWithContentsOfURL:url];
            logoImage.image = [UIImage imageWithData:data];
        }

    This works great, however I get many warnings within the Debugger Console along the lines of:

        2010-05-10 14:30:14.052 ProjectTitle[2930:633f] *** _NSAutoreleaseNoPool(): Object 0x169d30 of class NSHTTPURLResponse autoreleased with no pool in place - just leaking

    This occurs many times each time I call the new thread, and then eventually, under no pattern, after calling a few of these threads I get the classic 'EXC_BAD_ACCESS' run-time error. I understand that this is happening because I'm not retaining the object, but how can I solve this with the code in loadImage shown above? Thanks
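
    A hedged sketch of the usual fix: a thread spawned this way needs its own autorelease pool, and UIKit objects such as the image view should only be touched from the main thread:

        - (void) loadImage {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];   // gives this thread a pool
            NSURL *url = [NSURL URLWithString:logoPath];
            NSData *data = [NSData dataWithContentsOfURL:url];
            UIImage *image = [UIImage imageWithData:data];
            // Hand the UIKit work back to the main thread.
            [logoImage performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:NO];
            [pool release];
        }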

    Read the article

  • Getting invalid context errors

    - by Andrew
    I don't have much code thus far, only this to start:

        UIGraphicsBeginImageContextWithOptions(bounds.size, NO, 0);
        CGContextRef context = UIGraphicsGetCurrentContext();

        CGMutablePathRef outerPath;
        CGMutablePathRef highlightPath;

        CGRect outerRect = rectForRectWithInset(bounds, 1);
        CGRect highlightRect = CGRectMake(outerRect.origin.x, outerRect.origin.y + 1, outerRect.size.width, outerRect.size.height);

    And then the problematic bit, which, when commented out, makes the error go away:

        CGContextSaveGState(context);
        CGContextAddPath(context, highlightPath);
        CGContextSetFillColorWithColor(context, [[UIColor colorWithWhite:1.0 alpha:0.05]CGColor]);
        CGContextFillPath(context);
        CGContextRestoreGState(context);

    Below that is simply:

        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
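
    A hedged observation and sketch: highlightPath is declared but never created, so CGContextAddPath() receives an uninitialized pointer. Creating the path from highlightRect (and releasing it) should clear that up; if "invalid context" messages persist, it is also worth checking that bounds.size is non-zero when the image context is begun, since a NULL current context produces the same complaints:

        CGMutablePathRef highlightPath = CGPathCreateMutable();
        CGPathAddRect(highlightPath, NULL, highlightRect);

        CGContextSaveGState(context);
        CGContextAddPath(context, highlightPath);
        CGContextSetFillColorWithColor(context, [[UIColor colorWithWhite:1.0 alpha:0.05] CGColor]);
        CGContextFillPath(context);
        CGContextRestoreGState(context);

        CGPathRelease(highlightPath);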

    Read the article

  • iPhone WebApp Question

    - by Henry D'Andrea
    I have this code:

        /** Save the web view as a screenshot. Currently only supports saving to the photo library. */
        - (void)saveScreenshot:(NSArray*)arguments withDict:(NSDictionary*)options {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGRect imageRect = CGRectMake(0, 0, CGRectGetWidth(screenRect), CGRectGetHeight(screenRect));
            UIGraphicsBeginImageContext(imageRect.size);
            [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            UIImageWriteToSavedPhotosAlbum(viewImage, self, nil, nil);
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:nil message:@"Image Saved" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [alert show];
            [alert release];
        }

    This is for saving whatever you drew in my app. How would I add the button for this in the HTML code, and how do I call it from there?

    Read the article

  • Why are all my masked views unmasked in my view snapshot?

    - by mystify
    I'm taking a snapshot of a view. This view has some subviews which have layer masks applied to them. For some reason, those masks take no effect in the snapshot and the masked parts are completely visible.

        UIGraphicsBeginImageContext(theView.frame.size);
        [theView.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

    I assume this is a bug in the framework. But maybe it's not? Did I do anything wrong here?
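
    For what it's worth, -renderInContext: is documented not to render layer masks (along with a few other Core Animation features), so this is a known limitation rather than a misuse. A heavily hedged sketch of one workaround is to apply the mask by hand while rendering the affected subview; it assumes the mask layer's contents is a CGImage, which may not match how the masks here were built:

        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGContextSaveGState(ctx);
        CGContextTranslateCTM(ctx, maskedSubview.frame.origin.x, maskedSubview.frame.origin.y);
        CGContextClipToMask(ctx, maskedSubview.bounds,
                            (CGImageRef)maskedSubview.layer.mask.contents);   // assumption: an image-backed mask
        [maskedSubview.layer renderInContext:ctx];
        CGContextRestoreGState(ctx);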

    Read the article

  • UILabel's text disappears when animating

    - by Wilhelm Michaelsen
    I have this code:

        - (void)my_button_tapped {
            if (my_button.tag == 0) {
                [UIView beginAnimations:nil context:nil];
                [UIView setAnimationDuration:0.5];
                my_label.frame = CGRectMake(450, 455, 200, 20);
                [UIView commitAnimations];
                [my_button setBackgroundImage:[UIImage imageNamed:@"check.png"] forState:UIControlStateNormal];
                my_button.tag = 1;
            } else {
                [UIView beginAnimations:nil context:nil];
                [UIView setAnimationDuration:0.5];
                my_label.frame = CGRectMake(450, 455, 0, 20);
                [UIView commitAnimations];
                [my_button setBackgroundImage:nil forState:UIControlStateNormal];
                my_button.tag = 0;
            }
        }

    When I tap my_button the first time, the label expands to 200px wide; when I press the button again, the label shrinks to 0px wide, but the text disappears immediately at the button press. What's wrong?
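
    A hedged explanation and sketch: UIKit renders a view's content for the destination bounds as soon as the animation starts, so shrinking the label itself makes the text vanish immediately. One workaround is to leave the label's own frame alone and animate a clipping container around it instead (the container here is a hypothetical addition):

        UIView *labelClipView = [[UIView alloc] initWithFrame:my_label.frame];   // same spot as the label
        labelClipView.clipsToBounds = YES;
        [my_label.superview addSubview:labelClipView];
        my_label.frame = labelClipView.bounds;        // label keeps its full 200pt width inside the container
        [labelClipView addSubview:my_label];

        [UIView beginAnimations:nil context:nil];
        [UIView setAnimationDuration:0.5];
        labelClipView.frame = CGRectMake(450, 455, 0, 20);   // collapse the container, not the label
        [UIView commitAnimations];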

    Read the article

  • How to manage orientation in iPad app

    - by Annie
    I have added an image to the navigation controller in my app delegate class, and I return YES for rotation:

        - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
            loginViewController = [[LoginViewController alloc] initWithNibName:@"LoginViewController" bundle:nil];
            [self.window addSubview:_navController.view];
            _navController.navigationBarHidden = NO;
            navimage = [[UIImageView alloc] init];
            navimage.frame = CGRectMake(300, 18, 177, 47);
            navimage.image = [UIImage imageNamed:@"logo.png"];
            [_navController.view addSubview:navimage];
            [self.window makeKeyAndVisible];
            return YES;
        }

        - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation {
            return YES;
        }

    But the navigation image's position is not changing; the frame remains the same in both orientations. Please give me an idea of how this can be solved for an image on a navigation controller.
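
    Two hedged options, sketched below: give the image view an autoresizing mask so its frame follows the rotated bounds (the exact mask depends on where the logo should sit after rotation), or let the navigation bar position the logo itself via the title view:

        // Option 1: keep the manual subview, but let it track the right edge on rotation.
        navimage.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleBottomMargin;

        // Option 2: hand the logo to the navigation bar instead of the controller's view.
        UIImageView *logoView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"logo.png"]];
        loginViewController.navigationItem.titleView = logoView;
        [logoView release];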

    Read the article

  • How to check image during animation

    - by TomTom
    I have set up an animation in the following way (self is a UIImageView, myImages an array of UIImages):

        self.animationImages = myImages;
        self.animationDuration = 50;
        self.animationRepeatCount = 0;
        [self startAnimating];

    During the animation I'd like to check the current image. I tried it the following way:

        if ([self image] == [UIImage imageNamed:@"image1.png"]);

    but this does not work. Is there a straightforward way to do this? Can I keep track of which image is shown during the animation?
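
    A hedged sketch of an alternative: UIImageView does not expose which element of animationImages is currently on screen, and the image property does not change while the built-in animation runs, so one option is to drive the frames manually with a timer and track the index yourself (currentFrame is a hypothetical NSUInteger property):

        - (void)startManualAnimation {
            self.currentFrame = 0;
            [NSTimer scheduledTimerWithTimeInterval:50.0 / [myImages count]   // same pacing as animationDuration = 50
                                             target:self
                                           selector:@selector(showNextFrame:)
                                           userInfo:nil
                                            repeats:YES];
        }

        - (void)showNextFrame:(NSTimer *)timer {
            self.image = [myImages objectAtIndex:self.currentFrame];          // now you always know the frame
            self.currentFrame = (self.currentFrame + 1) % [myImages count];
        }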

    Read the article

  • iPhone: How can you draw a piece of an image

    - by Mark
    Code sample:

        - (void)drawRect:(CGRect)rect {
            [super drawRect:rect];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextSaveGState(context);
            CGContextTranslateCTM(context, 0, self.frame.size.height);
            CGContextScaleCTM(context, 1.0, -1.0);
            CGContextDrawImage(context, CGRectMake(0.0, 0.0, self.frame.size.width, self.frame.size.height), [UIImage imageNamed:@"sample.png"].CGImage);
            CGContextRestoreGState(context);
        }

    I would like to copy a certain rect within an image to the context, so that not the entire image is drawn but just a piece of it. Does anyone have a solution for this? I can't find anything on Google or in the documentation. I know there are alternatives like:

    1. Create a UIView with clipping and then just position the UIImageView within it.
    2. Create the UIImageView within a UIScrollView and use the content offset.

    But I think those are lame...
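
    A hedged sketch using CGImageCreateWithImageInRect(), which crops the source image (in the image's own pixel coordinates) before it is drawn; the crop rect below is just an illustrative value:

        CGImageRef fullImage = [UIImage imageNamed:@"sample.png"].CGImage;
        CGRect pieceRect = CGRectMake(10.0f, 10.0f, 50.0f, 50.0f);             // hypothetical sub-rect, in pixels
        CGImageRef piece = CGImageCreateWithImageInRect(fullImage, pieceRect);
        CGContextDrawImage(context, CGRectMake(0.0, 0.0, self.frame.size.width, self.frame.size.height), piece);
        CGImageRelease(piece);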

    Read the article

  • How do I store an image's pixels in matrix form?

    - by Rajendra Bhole
    Hi, I am developing an application in which I want to store the pixelized image in matrix format. The code is as follows:

        struct pixel {
            //unsigned char r, g, b, a;
            Byte r, g, b;
            int count;
        };

        - (NSInteger)processImage1:(UIImage*)image {
            // Allocate a buffer big enough to hold all the pixels
            struct pixel* pixels = (struct pixel*) calloc(1, image.size.width * image.size.height * sizeof(struct pixel));
            if (pixels != nil) {
                // Create a new bitmap
                CGContextRef context = CGBitmapContextCreate(
                    (void*) pixels,
                    image.size.width,
                    image.size.height,
                    8,
                    image.size.width * 4,
                    CGImageGetColorSpace(image.CGImage),
                    kCGImageAlphaPremultipliedLast
                );
                NSLog(@"1=%d, 2=%d, 3=%d", CGImageGetBitsPerComponent(image), CGImageGetBitsPerPixel(image), CGImageGetBytesPerRow(image));
                if (context != NULL) {
                    // Draw the image in the bitmap
                    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, image.size.width, image.size.height), image.CGImage);
                    NSUInteger numberOfPixels = image.size.width * image.size.height;

    I am confused about how to initialize the 2-D matrix in which to store the pixel data.
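
    A minimal sketch of the indexing, for what it's worth: the calloc'd buffer is already a flat width * height array, so a "2-D matrix" is simply row-major indexing into it rather than a separate structure:

        NSUInteger width  = image.size.width;
        NSUInteger height = image.size.height;
        for (NSUInteger y = 0; y < height; y++) {
            for (NSUInteger x = 0; x < width; x++) {
                struct pixel p = pixels[y * width + x];
                // p.r, p.g, p.b now hold the channel values for the pixel at (x, y)
            }
        }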

    Read the article

  • Pass a class as a parameter?

    - by JuBu1324
    I have been led to believe that it is possible to pass a class as a method parameter, but I'm having trouble implementing the concept. Right now I have something like:

        - (id)navControllerFromView:(Class *)viewControllerClass title:(NSString *)title imageName:(NSString *)imageName {
            viewControllerClass *viewController = [[viewControllerClass alloc] init];
            UINavigationController *thisNavController = [[UINavigationController alloc] initWithRootViewController:viewController];
            thisNavController.tabBarItem = [[UITabBarItem alloc] initWithTitle:title image:[UIImage imageNamed:imageName] tag:3];
            return thisNavController;
        }

    and I call it like this:

        rootNavController = [self navControllerFromView:RootViewController title:@"Contact" imageName:@"my_info.png"];

    What's wrong with this picture?
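
    A hedged sketch of the fixes that are usually needed here: the parameter type should be Class (no asterisk), the local variable gets an ordinary object type such as UIViewController * or id, and the call site passes [RootViewController class] rather than the bare class name:

        - (id)navControllerFromView:(Class)viewControllerClass title:(NSString *)title imageName:(NSString *)imageName {
            UIViewController *viewController = [[viewControllerClass alloc] init];
            UINavigationController *thisNavController = [[UINavigationController alloc] initWithRootViewController:viewController];
            thisNavController.tabBarItem = [[[UITabBarItem alloc] initWithTitle:title image:[UIImage imageNamed:imageName] tag:3] autorelease];
            [viewController release];   // the navigation controller retains its root
            return thisNavController;
        }

        // Call site:
        rootNavController = [self navControllerFromView:[RootViewController class] title:@"Contact" imageName:@"my_info.png"];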

    Read the article

  • UITableView scrollable background , UITableViewCell transparency

    - by f0rz
    Hi! In my UITableView I have custom cells. My UITableView has a repeating background:

        myTableView.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"backg.png"]];

    I set my cells to have a clear background:

        UIView *backView = [[[UIView alloc] initWithFrame:CGRectZero] autorelease];
        backView.backgroundColor = [UIColor clearColor];
        cell.backgroundView = backView;

    This should make the cell have the same background as the table view. The problem is that every cell seems to load myTableView.backgroundColor once again, so the background is not repeated as it should be. I want the cells to have no background at all, but instead they load up myTableView.backgroundColor once again. Can anyone help me?
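
    A hedged workaround, assuming the table sits inside self.view: make the table itself transparent and put the tiled image in a view behind it, so the pattern is drawn once for the whole table instead of per cell:

        myTableView.backgroundColor = [UIColor clearColor];
        myTableView.opaque = NO;

        UIView *patternView = [[UIView alloc] initWithFrame:myTableView.frame];
        patternView.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"backg.png"]];
        [self.view insertSubview:patternView belowSubview:myTableView];
        [patternView release];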

    Read the article

  • UIGraphicsBeginImageContext question in Objective C

    - by Henry D'Andrea
    I need the UIGraphicsBeginImageContext(self.view.frame.size); call changed so that the .frame part pulls from the webView:

        - (void) save {
            UIGraphicsBeginImageContext(self.view.frame.size);
            [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
            NSLog(@"TEST");
        }

    WebView code:

        - (BOOL) webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)ntype {
            NSLog(@"Scheme: %@", request.URL.scheme);
            if ([request.URL.scheme isEqualToString:@"save"]) {
                [self save];
            }
            return true;
        }
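
    A minimal sketch of that change, assuming webView is an ivar or property reachable from -save: size the context from the web view and render the web view's layer rather than self.view's:

        - (void) save {
            UIGraphicsBeginImageContext(webView.frame.size);
            [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
        }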

    Read the article

  • iPhone SDK - How to display a photo taken with the camera inside a UINavigationController?

    - by dan
    This is my code so far:

        /* class: myViewController
           @interface myViewController: UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate> */

        - (IBAction) getPicture {
            UIImagePickerController *picker = [[UIImagePickerController alloc] init];
            picker.delegate = self;
            picker.sourceType = UIImagePickerControllerSourceTypeCamera;
            [self presentModalViewController:picker animated:YES];
        }

        - (void) imagePickerController:(UIImagePickerController *)thePicker didFinishPickingMediaWithInfo:(NSDictionary *)imageInfo {
            [[thePicker parentViewController] dismissModalViewControllerAnimated:YES];
            UIImage *img = [imageInfo objectForKey:@"UIImagePickerControllerOriginalImage"];
            self.myImageView.image = img;
        }

    So basically I'm trying to get a photo from the iPhone camera and display it in a UIImageView. This works perfectly fine as long as the class myViewController is displayed as a standalone view. If I put the view inside a UINavigationController, the UIImageView won't display the image after taking one with the camera. But if I choose a picture from the library, everything is fine again. So why won't the UIImageView display an image taken with the camera inside a UINavigationController?
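
    A hedged sketch of one thing worth trying: inside a navigation controller, [thePicker parentViewController] may not be the controller that actually presented the picker, so dismissing from self (and setting the image before the dismissal) is the more common pattern:

        - (void) imagePickerController:(UIImagePickerController *)thePicker didFinishPickingMediaWithInfo:(NSDictionary *)imageInfo {
            UIImage *img = [imageInfo objectForKey:UIImagePickerControllerOriginalImage];
            self.myImageView.image = img;
            [self dismissModalViewControllerAnimated:YES];
        }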

    Read the article

  • Adding a subview larger than cellHeight to a UITableViewCell?

    - by MathieuK
    I'm trying to add a subview to a UITableViewCell, and the design that I'm working from demands that this particular subview (an image) be larger than the actual UITableViewCell and thus partly overlap its siblings. So I've set up my table cell, generated my image and added it to the cell's contentView:

        // rowHeight for the UITableView is 45.0f
        UIImage *image = [self createCellThumbnail: someImage];
        UIImageView *thumbView = [[UIImageView alloc] initWithFrame: CGRectMake(150, -5, 55, 55)];
        thumbView.transform = CGAffineTransformMakeRotation(0.1f);
        thumbView.image = image;

        cell.clipsToBounds = NO;
        cell.contentView.clipsToBounds = NO;
        [cell.contentView addSubview: thumbView];

    While the image will 'overflow' into the cell below it, the top of the image is always clipped, as demonstrated here: http://imgur.com/WDsAx. Does anyone know if what I'm trying to do is possible with the current approach, or should I just figure out a way to draw these images onto the UITableView after all the cells are drawn? (It's a non-scrollable table view, so that would work and be fairly easy.)
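
    One heavily hedged idea: even with clipping disabled, the overflowing part can simply be covered by the neighboring cell that sits above this one in the z-order rather than being clipped. Raising the cell's layer zPosition (QuartzCore) is one way to let the oversized thumbnail draw over its siblings; whether this is enough depends on how the table composites its cells:

        #import <QuartzCore/QuartzCore.h>

        cell.clipsToBounds = NO;
        cell.contentView.clipsToBounds = NO;
        cell.layer.zPosition = 1.0;               // draw this cell above its neighbours
        [cell.contentView addSubview:thumbView];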

    Read the article

  • Documented process for using facebook connect for the iPhone to upload photos

    - by Corey Floyd
    After looking, I did come across this post on the Facebook forums: link. They are feeding the Facebook object a UIImage. That seems logical, but where is this documented? The API documentation is generalized to all platforms. Where are the iPhone-specific requirements for arguments and their data types? Thanks

    Update: I still have not come across any API docs pertaining to Cocoa. I did, however, gather the information I needed by piecing together forum information, Facebook sample code, and some glue. Hopefully they'll issue something a little more concrete over the next few months.

    Read the article

  • How do I use a contact's photo in a table view cell?

    - by Andy
    I've got an app that has a table view that displays contact information in each row. I'd like to use the contact's stored image (if there is one available) as the image on the left-hand side of the cell. I've found some sketchy sample code in Apple's documentation, but the address book references some kind of weird data type (CFDataRef) that doesn't appear to correspond to the data types referenced in the table view programming guide (mainly UIImage). This seems like a pretty basic task, but I can't seem to wrap my head around it. Thanks in advance for any help you can offer.
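
    A hedged sketch of the bridge between the two types: CFDataRef is toll-free bridged to NSData, so the Address Book image bytes can be handed straight to UIImage (person here stands for the ABRecordRef backing the row, and the placeholder image name is made up):

        #import <AddressBook/AddressBook.h>

        if (ABPersonHasImageData(person)) {
            NSData *imageData = (NSData *)ABPersonCopyImageData(person);
            cell.imageView.image = [UIImage imageWithData:imageData];
            [imageData release];   // Copy rule: we own the returned data
        } else {
            cell.imageView.image = [UIImage imageNamed:@"defaultContact.png"];   // hypothetical placeholder
        }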

    Read the article

  • simple obj-c naming question

    - by Highstead
    This question is more about figuring out how to look up classes and objects in Objective-C, but I lack the knowledge to figure out how to look this up, so I pose the question here. In .NET, if I had MyObject.MyValue, MyValue would be called a property, and I could look it up on MSDN. In Java I would check the Javadocs online (and that property would have to be a value). In Objective-C, what is that called, and if I wanted to look it up, where would I look? Example:

        //Object.???
        UIImage.backgroundColor = [UIColor blueColor];
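
    For what it's worth, Objective-C calls these properties as well; they are declared with @property and documented per class in Apple's class references (e.g. backgroundColor appears in the UIView reference, since UIImage itself has no such property). A tiny illustrative sketch, with a made-up property name:

        @interface MyView : UIView
        @property (nonatomic, retain) UIColor *highlightColor;   // hypothetical property declaration
        @end

        // Dot syntax compiles down to the accessor methods:
        someView.backgroundColor = [UIColor blueColor];          // same as [someView setBackgroundColor:[UIColor blueColor]]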

    Read the article

  • Make a method call in Objective-C - help!

    - by Henry D'Andrea
    I need to make a call where it says "Add Call here". Can someone help?

        - (BOOL) webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)ntype {
            NSLog(@"Scheme: %@", request.URL.scheme);
            if ([request.URL.scheme isEqualToString:@"save"]) {
                //Add Call here
            }
            return true;
        }

    The code I need to call is this:

        - (void) save {
            UIGraphicsBeginImageContext(self.view.frame.size);
            [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
            NSLog(@"TEST");
        }
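
    A minimal sketch, assuming -save lives on the same class as the delegate method: invoke it directly where the comment sits, and return NO so the fake "save" URL is not actually loaded:

        - (BOOL) webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)ntype {
            if ([request.URL.scheme isEqualToString:@"save"]) {
                [self save];
                return NO;   // the pseudo-URL has done its job
            }
            return YES;
        }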

    Read the article

  • Visual artifacts on UIView rotation with tiled background image.

    - by Halbanonym
    I have an iPad app with a standard UIViewController/UIView setup - all rotations are allowed. The UIView draws a tiled image as its background (the tile is 256*256 pixels):

        - (void)drawRect:(CGRect)rect {
            [[UIImage imageNamed: @"Background.png"] drawAsPatternInRect: rect];
        }

    When I turn my iPad, I can see that during the rotation the image pattern of the original orientation is scaled to fit the new orientation. Then - immediately after the animation is finished - the view redraws its background pattern in the final configuration, which is unscaled. The switch from a scaled to an unscaled pattern looks a bit ugly. Is there a way to circumvent (or hide) this stretching of the background pattern?
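
    Two hedged things to try, sketched below: have the view redraw itself when its bounds change instead of stretching a snapshot of the old contents, or drop the custom drawRect: entirely and let a pattern color do the tiling:

        // Option 1: redraw on bounds change rather than scaling the cached contents.
        self.contentMode = UIViewContentModeRedraw;

        // Option 2: no drawRect: needed at all; the background color tiles the pattern.
        self.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"Background.png"]];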

    Read the article

  • UIButton setBackgroundImage doesn't release memory?

    - by just_another_coder
    My UIButton has its background image set like this:

        [myImageButton setBackgroundImage:[UIImage imageNamed:myImageName] forState:UIControlStateNormal];

    myImageButton is a retained property of the class and is set up with IB. Nowhere else is it accessed in the app. myImageName is simply an NSString with a filename like @"myImage_number_1.png". I am loading large images, 1024 x 1024 in size. When the view is shown, it changes the image with the above statement, and then available memory decreases. After I see the view about 7-9 different times, the app crashes with a memory warning. I thought the method would free up the loaded image. The view itself is only instantiated and allocated one time, so it's not in the retain/release cycle of the view controller. Is there something about setBackgroundImage: I don't know that causes it to not release memory?
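
    A hedged explanation and sketch: [UIImage imageNamed:] caches every image it loads for the lifetime of the app, which adds up quickly with 1024 x 1024 images and can look exactly like a leak. Loading through imageWithContentsOfFile: bypasses that cache, so each background can be freed once the button replaces it:

        NSString *path = [[NSBundle mainBundle] pathForResource:myImageName ofType:nil];   // myImageName already includes ".png"
        UIImage *background = [UIImage imageWithContentsOfFile:path];
        [myImageButton setBackgroundImage:background forState:UIControlStateNormal];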

    Read the article
