Archive for the ‘Android’ Category

Root Nexus One 2.3.6

December 17, 2011

Used this “DooMLoRD Easy Rooting Toolkit v3.0 (using zergRush exploit)” (Windows) and rooted my phone in less than 3 minutes. Also downloaded ROM Toolbox.

In the future, if I find more time and energy, I may play around with some custom ROMs like CyanogenMod. I would probably be most interested in getting them running on an emulator; it would be nice if someone put the compiled system.img files somewhere so that I don’t need to build from source ;).


Facebook for Android API 101

December 11, 2011

This morning I got to sit down and get a jump start on my first Facebook/mobile adventure! I was able to follow through most of their documentation (Android Tutorial and Graph API).

When I tried to get my app signature (Android key hash) using the Java keytool, Windows XP said it couldn’t find the openssl command. I had to download both the Visual C++ 2008 Redistributables and Win32 OpenSSL v1.0.0e from here.
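For reference, the keytool-plus-openssl pipeline essentially takes the SHA-1 digest of the signing certificate’s bytes and Base64-encodes it. Here is a minimal sketch of that computation in plain Java (the class name and the dummy input bytes are made up for illustration; in practice the bytes come from the DER-encoded certificate in your debug keystore):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

public class KeyHashSketch {
    // Base64(SHA-1(certificateBytes)) -- the same value that
    // "keytool ... | openssl sha1 -binary | openssl base64" prints.
    public static String keyHash(byte[] certBytes) {
        try {
            MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            return Base64.getEncoder().encodeToString(sha1.digest(certBytes));
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e); // SHA-1 is always available
        }
    }

    public static void main(String[] args) {
        // Dummy bytes stand in for the real certificate.
        System.out.println(keyHash("not-a-real-certificate".getBytes(StandardCharsets.UTF_8)));
    }
}
```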

I wasn’t able to save/read the access token from the preferences file, so I had to set the access token in the app manually. I got the token from the Graph API Explorer.

Escaping Angle Brackets in XML in Android

November 20, 2011

Either specify those characters like this:

< becomes &lt;
> becomes &gt;

Or use a CDATA section and put those characters inside:

<![CDATA[<]]> <![CDATA[>]]>
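For example, in a string resource both forms produce the same displayed text (the file path and resource names below are just for illustration):

```xml
<!-- res/values/strings.xml (hypothetical example) -->
<resources>
    <!-- escaped entities -->
    <string name="escaped_example">5 &lt; 10 &gt; 2</string>
    <!-- CDATA section: the characters inside are taken literally -->
    <string name="cdata_example"><![CDATA[5 < 10 > 2]]></string>
</resources>
```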


Accessing resources in Android

November 20, 2011

In most cases, you need to get hold of a Context object in order to access the resources in the app (the local package). Because Activity extends Context, an instance of Activity gives you access to everything a Context does. For example:

context.getResources().getString(R.string.resName);

or simply

context.getString(R.string.resName);

If you want to access the local package’s resources from an object that doesn’t inherit from Context, you need to pass a context object to it somehow. On the other hand, you can access the resources provided by the system (Android) without any context object:

final Resources r = Resources.getSystem();
r.getString(android.R.string.untitled);


Android: Sqlite long to datetime

August 21, 2011

If you store time using android.text.format.Time.toMillis, it ends up in SQLite as a 13-digit number (milliseconds since the Unix epoch), something like “1311123600000”. You can do the following to make it human readable:

SELECT datetime(start/1000, 'unixepoch') AS start, datetime(end/1000, 'unixepoch') AS end FROM countdowns

-- output: 2011-07-20 01:00:00 (this is UTC; add the 'localtime' modifier, i.e. datetime(start/1000, 'unixepoch', 'localtime'), to convert to local time)
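The same conversion can be checked outside SQLite. A small sketch in plain Java (the class name is made up, and it uses the modern java.time API rather than anything 2011-era Android shipped): Instant.ofEpochMilli takes the 13-digit millisecond value directly, so no manual division by 1000 is needed.

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class EpochDemo {
    // Format a 13-digit millisecond timestamp the same way SQLite's
    // datetime(x/1000, 'unixepoch') does: "yyyy-MM-dd HH:mm:ss" in UTC.
    public static String toUtcString(long millis) {
        return DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
                .withZone(ZoneOffset.UTC)
                .format(Instant.ofEpochMilli(millis));
    }

    public static void main(String[] args) {
        System.out.println(toUtcString(1311123600000L)); // the value from the post
    }
}
```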

Android AppWidget 101

May 23, 2011

0. To create an appWidget, you need four things: in your application manifest xml, declare the appWidget provider (in a <receiver> tag, see code below); a layout xml for your remote views (in “res/layout”); an app widget provider xml (in “res/xml” with an <appwidget-provider> tag); and, optionally, an AppWidgetProvider implementation for your app’s logic.

<receiver android:name=".TestAppWidgetProvider" android:enabled="true">
    <intent-filter>
        <action android:name="android.appwidget.action.APPWIDGET_UPDATE"/>
    </intent-filter>
    <meta-data android:name="android.appwidget.provider" android:resource="@xml/widget"/>
</receiver>
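For completeness, the “res/xml” file referenced above (@xml/widget) is a plain <appwidget-provider> description. A minimal sketch, with made-up sizes and layout name (not from the original post):

```xml
<!-- res/xml/widget.xml (the attribute values here are assumptions) -->
<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="146dp"
    android:minHeight="72dp"
    android:updatePeriodMillis="1800000"
    android:initialLayout="@layout/test_widget"/>
```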

1. You can explicitly update the appWidgets using AppWidgetManager instead of relying on the timer mechanism. This means your app can register all kinds of listeners and update your appWidgets when the corresponding events occur:

final AppWidgetManager gm = AppWidgetManager.getInstance(context);
gm.updateAppWidget(new ComponentName("com.maohao.android.myappnamespace",
        "com.maohao.android.myappnamespace.MyAppWidgetProvider"), views);

2. Each time a new appWidget is added to the home screen,

AppWidgetProvider.onReceive > onUpdate

gets called; each time an appWidget is deleted from the home screen,

onReceive > onDeleted

gets called.

When the first appWidget is added to the home screen,

onReceive > onEnabled > onReceive > onUpdate

get called; android.appwidget.action.APPWIDGET_ENABLED is the received intent action.

When the last appWidget is removed from the home screen,

onReceive > onDeleted > onReceive > onDisabled

get called; android.appwidget.action.APPWIDGET_DELETED is the action for onDeleted, and android.appwidget.action.APPWIDGET_DISABLED is the action for onDisabled.

You don’t seem to need to explicitly declare, in your manifest xml, that your AppWidgetProvider receives the APPWIDGET_DELETED/APPWIDGET_DISABLED actions, but you do need to declare an intent filter for the APPWIDGET_UPDATE action.

Tips of Eclipse for Android

May 30, 2010

1. Override/Implement methods (actually it declares those methods for you to implement :))

In the “Package Explorer”, right-click the class, or press “Alt+Shift+S” to open the “Source” menu, or choose “Source” from the menu bar;

then choose “Override/Implement Methods”.

ListView and ListActivity Demo

November 26, 2009

This is a slightly modified version of the ApiDemos sample. It demonstrates how to select multiple items in a ListView and display the results in a TextView.

main.xml layout (note that the ListView has “choiceMode” set to “multipleChoice” and an id of “@android:id/list”):

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">
    <TextView
        android:id="@+id/selection"
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"
        android:text=" "/>
    <ListView
        android:id="@android:id/list"
        android:choiceMode="multipleChoice"
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"/>
</LinearLayout>

Main java class:

package com.mh.android.test;

import android.app.ListActivity;
import android.os.Bundle;
import android.util.Log;
import android.util.SparseBooleanArray;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.ListView;
import android.widget.TextView;

public class ListAdapterTest extends ListActivity {

    String[] items = {"lorem", "ipsum", "dolor", "sit", "amet",
            "consectetuer", "adipiscing", "elit", "morbi", "vel",
            "ligula", "vitae", "arcu", "aliquet", "mollis",
            "etiam", "vel", "erat", "placerat", "ante",
            "porttitor", "sodales", "pellentesque", "augue", "purus"};

    TextView selection;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        setListAdapter(new ArrayAdapter<String>(this,
                android.R.layout.simple_list_item_multiple_choice,
                items));
        selection = (TextView) findViewById(R.id.selection);
    }

    @Override
    protected void onListItemClick(ListView parent, View v, int position, long id) {
        // Clear the TextView before we assign the new content.
        selection.setText(" ");
        // Get the array of booleans for which positions are checked in the list.
        // This SparseBooleanArray pairs keys (positions) with boolean values,
        // accessible via keyAt(index) and valueAt(index) respectively.
        SparseBooleanArray chosen = parent.getCheckedItemPositions();
        for (int i = 0; i < chosen.size(); i++) {
            Log.d("selection", "index: " + i + "; key: " + chosen.keyAt(i)
                    + "; value: " + chosen.valueAt(i) + "; " + items[chosen.keyAt(i)]);
            // If the item is checked by the user, display it in the TextView.
            if (chosen.valueAt(i)) {
                selection.append(items[chosen.keyAt(i)] + " ");
            }
        }
    }
}

Focus?

November 26, 2009

I am always confused by the word “focus” (in the context of user interfaces, which embarrasses me very much since I make a living as a “user interface designer”). Now I think I am a little clearer about its meaning, so here is a note for myself.

In the WIMP environment, for a pointing device such as a mouse, it’s always clear where the “focus” is: it’s wherever you point the mouse. In this paradigm, the spatial input is continuous. But in a non-desktop environment where you use a D-pad, or in WIMP when the keyboard is your current input, it’s not always clear where the input goes. The concept of “focus” indicates where the spatial input currently is in the GUI. When a UI element has focus, its behavior depends on what kind of control it is. A button reacts as if it “received a mouse click” when the user presses the Enter key. An editable text field shows a caret cursor and is ready to take keyboard input (whatever you type goes into the field in focus). A hyperlink, when you press Enter, does the same thing as if you clicked directly on it (normally loading the page it links to :)).

Note that the visual feedback and behavior of the keyboard-input/“discrete spatial input” paradigm is normally different from that of the mouse-input/“continuous spatial input” paradigm (and also different from that of touch input; more on this later). The picture below shows the “Go” button in focus while the “Search” button has a mouse hover.

Fig 1.

Focus on tab vs. mouse hover

Here are some notes:

1. A UI element can be “disabled” but still receive focus. See figures 2-4.

Fig 2.

"Ok" button is disabled

Fig 3.

"Cancel" button is in focus

Fig 4.

"Ok" button is in focus

2. The MSDN glossary says “input focus” means “The location where the user is currently directing input. Note that just because a location in the UI is highlighted does not necessarily mean this location has input focus.” I guess that scenario is when you hover the mouse over a UI element?

3. Touch screen paradigms. Like the iPhone, Android opts not to deal with focus (showing highlight or any other visuals) in “touch mode”. An interesting reflection is about potentially different interaction paradigms for capacitive vs. resistive touch. Could the level of pressure determine whether the UI receives focus, as opposed to taking a “click”? Would the “80 percent” of users be surprised and/or confused?

4. Design guideline for “Input focus location” at MSDN.

5. Besides the keyboard, D-pad, mouse, and touch, there is another way focus can be acquired: by code. When focus is acquired programmatically, the UI behaves as if it were acquired by the user tabbing with the keyboard/D-pad.

6. It’s safe to say, for now, that the rationale for designing co-existent interaction paradigms is to treat the UI as if the user has multiple input/pointing devices at hand (mouse, D-pad, keyboard tabs/arrow keys, fingers). The UI shall give feedback for each input mechanism respectively, and one input device shall not interfere with the behaviors of the others. For example, tabbing to put another UI element in focus should not change your mouse cursor’s current location.

Type of View.OnTouchListener

October 4, 2009

I find the documentation on the return value of android.view.View.OnTouchListener.onTouch misleading. It says: “Return true if the listener has consumed the event, false otherwise.”

To me, it looks like it should be almost the opposite:

“Return true if the event is allowed to propagate, false otherwise.”

// Sample code
public boolean onTouch(View v, MotionEvent evt) {
    switch (evt.getAction()) {
        case MotionEvent.ACTION_DOWN:
            return true; // "allow propagation" here, since we need to keep receiving the ACTION_MOVE events below
        case MotionEvent.ACTION_MOVE:
            Log.d("TOUCH", "historical size is " + evt.getHistoricalSize());
            return false;
        default:
            break;
    }
    return false; // for any motions other than DOWN/MOVE, we choose to block them, unless other code wants to handle them
}

Please do feel free to correct me if I am wrong. Thanks.