allensdk.brain_observatory.stimulus_info module

class allensdk.brain_observatory.stimulus_info.BinaryIntervalSearchTree(search_list)[source]

Bases: object

add(input_list, tmp=None)[source]
static from_df(input_df)[source]
search(fi, tmp=None)[source]
class allensdk.brain_observatory.stimulus_info.BrainObservatoryMonitor(experiment_geometry=None)[source]

Bases: allensdk.brain_observatory.stimulus_info.Monitor

References:
http://help.brain-map.org/display/observatory/Documentation?preview=/10616846/10813485/VisualCoding_VisualStimuli.pdf
https://www.cnet.com/products/asus-pa248q/specs/

grating_to_screen(phase, spatial_frequency, orientation, **kwargs)[source]
lsn_image_to_screen(img, **kwargs)[source]
pixels_to_visual_degrees(n, **kwargs)[source]
visual_degrees_to_pixels(vd, **kwargs)[source]
warp_image(img, **kwargs)[source]
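
The pixel/degree conversions above reduce, under the small-angle approximation, to a simple proportionality between on-screen size and viewing angle. A minimal sketch, using the Brain Observatory geometry from the documents linked above (1920x1200 px panel, 51.0 cm wide, viewed at 15 cm); the internals are an illustration, not the library's implementation:

```python
import numpy as np

# Illustrative monitor geometry (from the Brain Observatory docs above).
MON_WIDTH_CM = 51.0
MON_RES_X = 1920
DISTANCE_CM = 15.0

def pixels_to_visual_degrees(n):
    """n pixels -> visual degrees, small-angle approximation (angle ~ size / distance)."""
    cm = n * MON_WIDTH_CM / MON_RES_X
    return np.degrees(cm / DISTANCE_CM)

def visual_degrees_to_pixels(vd):
    """Inverse of the conversion above."""
    cm = np.radians(vd) * DISTANCE_CM
    return cm * MON_RES_X / MON_WIDTH_CM
```

With these numbers, 100 pixels subtend roughly 10.1 degrees of visual angle, and the two functions invert each other exactly.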
class allensdk.brain_observatory.stimulus_info.ExperimentGeometry(distance, mon_height_cm, mon_width_cm, mon_res, eyepoint)[source]

Bases: object

generate_warp_coordinates()[source]
warp_coordinates
class allensdk.brain_observatory.stimulus_info.Monitor(n_pixels_r, n_pixels_c, panel_size, spatial_unit)[source]

Bases: object

aspect_ratio
get_mask()[source]
grating_to_screen(phase, spatial_frequency, orientation, distance_from_monitor, p2p_amp=256, baseline=127, translation=(0, 0))[source]
height
lsn_image_to_screen(img, stimulus_type, origin='lower', background_color=127, translation=(0, 0))[source]
map_stimulus(source_stimulus_coordinate, source_stimulus_type, target_stimulus_type)[source]
mask
natural_movie_image_to_screen(img, origin='lower', translation=(0, 0))[source]
natural_scene_image_to_screen(img, origin='lower', translation=(0, 0))[source]
panel_size
pixel_size
pixels_to_visual_degrees(n, distance_from_monitor, small_angle_approximation=True)[source]
set_spatial_unit(new_unit)[source]
show_image(img, ax=None, show=True, mask=False, warp=False, origin='lower')[source]
spatial_frequency_to_pix_per_cycle(spatial_frequency, distance_from_monitor)[source]
visual_degrees_to_pixels(vd, distance_from_monitor, small_angle_approximation=True)[source]
width
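
A hedged sketch of what `spatial_frequency_to_pix_per_cycle` computes: a grating of spatial frequency `sf` (cycles/degree) spans `1/sf` visual degrees per cycle, and converting that span to pixels with the small-angle rule gives pixels per cycle. The monitor numbers below are illustrative defaults, not values read from a `Monitor` instance:

```python
import numpy as np

def spatial_frequency_to_pix_per_cycle(sf, distance_from_monitor,
                                       mon_width_cm=51.0, n_pixels_c=1920):
    # one cycle spans 1/sf visual degrees
    deg_per_cycle = 1.0 / sf
    # small-angle: degrees -> cm on the panel -> pixels
    cm_per_cycle = np.radians(deg_per_cycle) * distance_from_monitor
    return cm_per_cycle * n_pixels_c / mon_width_cm
```

For example, a 0.04 cycles/degree grating viewed at 15 cm works out to roughly 246 pixels per cycle on this panel.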
class allensdk.brain_observatory.stimulus_info.StimulusSearch(nwb_dataset)[source]

Bases: object

search(**kwargs)[source]
allensdk.brain_observatory.stimulus_info.all_stimuli()[source]

Return a list of all stimuli in the data set

allensdk.brain_observatory.stimulus_info.get_spatial_grating(height=None, aspect_ratio=None, ori=None, pix_per_cycle=None, phase=None, p2p_amp=2, baseline=0)[source]
allensdk.brain_observatory.stimulus_info.get_spatio_temporal_grating(t, temporal_frequency=None, **kwargs)[source]
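
A minimal numpy sketch of the sinusoid that `get_spatial_grating` evaluates: baseline plus a sine of peak-to-peak amplitude `p2p_amp`, oriented at `ori` degrees, with `pix_per_cycle` pixels per cycle (the spatio-temporal variant additionally advances the phase over time at `temporal_frequency`). Parameter names mirror the signature above; the internals are an assumption, not the library's code:

```python
import numpy as np

def spatial_grating(height, aspect_ratio, ori, pix_per_cycle, phase,
                    p2p_amp=2, baseline=0):
    width = int(height * aspect_ratio)
    y, x = np.meshgrid(np.arange(height), np.arange(width), indexing='ij')
    theta = np.radians(ori)
    # project pixel coordinates onto the axis perpendicular to the bars
    u = x * np.cos(theta) + y * np.sin(theta)
    return baseline + (p2p_amp / 2.0) * np.sin(
        2 * np.pi * (u / pix_per_cycle + phase))

img = spatial_grating(120, 1920 / 1200, ori=45, pix_per_cycle=20, phase=0.0)
```

The result is a `(height, width)` array whose values stay within `baseline ± p2p_amp / 2`.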
allensdk.brain_observatory.stimulus_info.lsn_coordinate_to_monitor_coordinate(lsn_coordinate, monitor_shape, stimulus_type)[source]
allensdk.brain_observatory.stimulus_info.make_display_mask(display_shape=(1920, 1200))[source]

Build a display-shaped mask that indicates which pixels are on screen after warping the stimulus.

allensdk.brain_observatory.stimulus_info.map_monitor_coordinate_to_stimulus_coordinate(monitor_coordinate, monitor_shape, stimulus_type)[source]
allensdk.brain_observatory.stimulus_info.map_monitor_coordinate_to_template_coordinate(monitor_coord, monitor_shape, template_shape)[source]
allensdk.brain_observatory.stimulus_info.map_stimulus(source_stimulus_coordinate, source_stimulus_type, target_stimulus_type, monitor_shape)[source]
allensdk.brain_observatory.stimulus_info.map_stimulus_coordinate_to_monitor_coordinate(template_coordinate, monitor_shape, stimulus_type)[source]
allensdk.brain_observatory.stimulus_info.map_template_coordinate_to_monitor_coordinate(template_coord, monitor_shape, template_shape)[source]
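
The monitor/template mapping functions above move points between coordinate frames of different resolutions. A sketch of the proportional rescaling assumed to underlie `map_monitor_coordinate_to_template_coordinate`: each axis is scaled by `template_size / monitor_size`. This illustrates only the shape of the mapping, not the library's exact code (which may also handle offsets for templates that do not fill the monitor):

```python
def monitor_to_template(monitor_coord, monitor_shape, template_shape):
    # per-axis proportional rescaling (an assumption, see note above)
    return tuple(c * t / m for c, m, t in
                 zip(monitor_coord, monitor_shape, template_shape))
```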
allensdk.brain_observatory.stimulus_info.mask_stimulus_template(template_display_coords, template_shape, display_mask=None, threshold=1.0)[source]

Build a mask for a stimulus template of a given shape and display coordinates that indicates which part of the template is on screen after warping.

Parameters:
template_display_coords: list

list of (x,y) display coordinates

template_shape: tuple

(width,height) of the display template

display_mask: np.ndarray

boolean 2D mask indicating which display coordinates are on screen after warping.

threshold: float

Minimum fraction of the display pixels associated with a template coordinate that must remain on screen for that coordinate to count as part of the mask.

Returns:
tuple: (template mask, pixel fraction)
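
A toy sketch of the thresholding idea behind `mask_stimulus_template`: each template pixel covers some set of display pixels, and it belongs to the mask only if at least `threshold` of that set is on screen. The block-based coverage model below is an illustration, not the library's exact template-to-display mapping:

```python
import numpy as np

def template_mask(display_mask, template_shape, threshold=1.0):
    th, tw = template_shape
    dh, dw = display_mask.shape
    bh, bw = dh // th, dw // tw   # display pixels per template pixel
    # fraction of each template pixel's display block that is on screen
    frac = display_mask[:th * bh, :tw * bw].reshape(th, bh, tw, bw).mean(axis=(1, 3))
    return frac >= threshold, frac

display = np.zeros((8, 8), dtype=bool)
display[:, :6] = True            # rightmost quarter of the display is off screen
mask, frac = template_mask(display, (4, 4), threshold=1.0)
```

In this toy case the rightmost template column falls entirely off screen, so its pixel fraction is 0 and it is excluded from the mask.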
allensdk.brain_observatory.stimulus_info.monitor_coordinate_to_lsn_coordinate(monitor_coordinate, monitor_shape, stimulus_type)[source]
allensdk.brain_observatory.stimulus_info.monitor_coordinate_to_natural_movie_coordinate(monitor_coordinate, monitor_shape)[source]
allensdk.brain_observatory.stimulus_info.natural_movie_coordinate_to_monitor_coordinate(natural_movie_coordinate, monitor_shape)[source]
allensdk.brain_observatory.stimulus_info.natural_scene_coordinate_to_monitor_coordinate(natural_scene_coordinate, monitor_shape)[source]
allensdk.brain_observatory.stimulus_info.rotate(X, Y, theta)[source]
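
`rotate(X, Y, theta)` is presumably a standard 2-D rotation about the origin; a minimal equivalent (theta taken in radians here; check the source for the library's angle convention):

```python
import numpy as np

def rotate(X, Y, theta):
    # standard counterclockwise rotation of (X, Y) by theta radians
    return (X * np.cos(theta) - Y * np.sin(theta),
            X * np.sin(theta) + Y * np.cos(theta))
```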
allensdk.brain_observatory.stimulus_info.sessions_with_stimulus(stimulus)[source]

Return the names of the sessions that contain a given stimulus.

allensdk.brain_observatory.stimulus_info.stimuli_in_session(session)[source]

Return a list of the stimuli available in a given session.

Parameters:
session: string

Must be one of: [stimulus_info.THREE_SESSION_A, stimulus_info.THREE_SESSION_B, stimulus_info.THREE_SESSION_C, stimulus_info.THREE_SESSION_C2]
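
`stimuli_in_session` and `sessions_with_stimulus` are two views of one session-to-stimuli lookup. A hedged sketch of that lookup, following the Brain Observatory three-session design; verify the exact session and stimulus names (and the session C2 variants) against the constants defined in `stimulus_info`:

```python
# Assumed mapping; session C2 and the locally-sparse-noise degree
# variants are omitted for brevity.
SESSION_STIMULUS_MAP = {
    'three_session_A': ['drifting_gratings', 'natural_movie_one',
                        'natural_movie_three', 'spontaneous'],
    'three_session_B': ['static_gratings', 'natural_scenes',
                        'natural_movie_one', 'spontaneous'],
    'three_session_C': ['locally_sparse_noise', 'natural_movie_one',
                        'natural_movie_two', 'spontaneous'],
}

def stimuli_in_session(session):
    return sorted(SESSION_STIMULUS_MAP[session])

def sessions_with_stimulus(stimulus):
    return [s for s, stims in sorted(SESSION_STIMULUS_MAP.items())
            if stimulus in stims]
```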

allensdk.brain_observatory.stimulus_info.translate_image_and_fill(img, translation=(0, 0))[source]
allensdk.brain_observatory.stimulus_info.warp_stimulus_coords(vertices, distance=15.0, mon_height_cm=32.5, mon_width_cm=51.0, mon_res=(1920, 1200), eyepoint=(0.5, 0.5))[source]

For a list of screen vertices, return the corresponding list of texture coordinates.

Parameters:
vertices: numpy.ndarray

[[x0,y0], [x1,y1], …] A set of vertices to convert to texture positions.

distance: float

distance from the monitor in cm.

mon_height_cm: float

monitor height in cm

mon_width_cm: float

monitor width in cm

mon_res: tuple

monitor resolution (x,y)

eyepoint: tuple

(x,y) position of the eye relative to the monitor, as fractions of monitor width and height (the default, (0.5, 0.5), is the monitor center)

Returns:
np.ndarray

x,y coordinates, shaped like the input, that describe which pixel coordinates are displayed at the input coordinates after warping the stimulus.
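
A sketch of the viewing geometry behind `warp_stimulus_coords`: each screen vertex is converted to cm relative to the eyepoint, and the warp is driven by the viewing angle `atan(offset / distance)` along each axis. This reproduces only the per-axis angle computation, not the full texture-coordinate remapping the function performs:

```python
import numpy as np

def vertex_angles_deg(vertices_px, distance=15.0, mon_height_cm=32.5,
                      mon_width_cm=51.0, mon_res=(1920, 1200),
                      eyepoint=(0.5, 0.5)):
    v = np.asarray(vertices_px, dtype=float)
    cm = np.empty_like(v)
    # pixel -> cm offset from the eyepoint, per axis
    cm[:, 0] = (v[:, 0] / mon_res[0] - eyepoint[0]) * mon_width_cm
    cm[:, 1] = (v[:, 1] / mon_res[1] - eyepoint[1]) * mon_height_cm
    # viewing angle of each vertex, in degrees
    return np.degrees(np.arctan2(cm, distance))
```

With the default geometry, the screen center subtends 0 degrees and the horizontal screen edge sits at roughly 59.5 degrees, which is why the warp is far from negligible at this viewing distance.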