Apr 21 14:55:57.031186 ip-10-0-134-40 systemd[1]: Starting Kubernetes Kubelet... Apr 21 14:55:57.469266 ip-10-0-134-40 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 21 14:55:57.469266 ip-10-0-134-40 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 21 14:55:57.469266 ip-10-0-134-40 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 21 14:55:57.469266 ip-10-0-134-40 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 21 14:55:57.469266 ip-10-0-134-40 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 21 14:55:57.471798 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.471730 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 21 14:55:57.476704 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476684 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:57.476704 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476699 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:57.476704 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476703 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:57.476704 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476707 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:57.476704 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476710 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:57.476704 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476713 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476715 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476718 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476721 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476724 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476726 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476729 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476731 2572 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476734 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476736 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476739 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476742 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476744 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476747 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476750 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476752 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476755 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476757 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476764 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476767 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:57.476925 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476769 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476772 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476775 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476777 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476781 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476785 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476788 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476791 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476793 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476796 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476798 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476801 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476803 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476806 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476809 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476812 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476814 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476817 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476819 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:57.477418 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476822 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476825 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476827 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476830 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476833 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476835 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476838 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476840 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: 
W0421 14:55:57.476843 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476846 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476848 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476851 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476853 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476856 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476858 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476861 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476864 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476867 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476874 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476876 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:57.477899 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476879 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476882 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476884 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476887 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476889 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476892 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476894 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476898 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476901 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476916 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476919 2572 feature_gate.go:328] unrecognized feature gate: 
MixedCPUsAllocation Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476922 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476925 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476929 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476933 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476936 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476939 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476941 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476944 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:57.478430 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476947 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476951 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.476955 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477852 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477859 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477862 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477865 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477868 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477871 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477873 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477876 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477878 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477881 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477884 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:57.478893 
ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477887 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477889 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477892 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477895 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477897 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:57.478893 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477900 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477914 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477918 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477920 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477923 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477926 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477929 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477932 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477935 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477937 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477940 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477943 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477946 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477948 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477951 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477953 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477956 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477958 2572 feature_gate.go:328] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477961 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477964 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:57.479365 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477966 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477969 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477972 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477974 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477977 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477980 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477982 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477985 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477988 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477990 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477993 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477995 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.477998 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478002 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478004 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478007 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478009 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478012 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478015 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478019 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 14:55:57.479867 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478023 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478026 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478029 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478031 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478034 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478038 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478041 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478044 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478047 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478051 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478054 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478056 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478059 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478062 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478066 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478068 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478071 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478073 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478076 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:57.480368 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478078 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478081 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478084 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: 
W0421 14:55:57.478086 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478089 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478091 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478094 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478097 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478100 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478102 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478105 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478171 2572 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478178 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478193 2572 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478197 2572 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478202 2572 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478206 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478210 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478214 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478217 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478220 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 14:55:57.480833 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478224 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478228 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478231 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478233 2572 flags.go:64] FLAG: --cgroup-root="" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478236 2572 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478239 2572 flags.go:64] FLAG: --client-ca-file="" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478242 2572 flags.go:64] FLAG: 
--cloud-config="" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478245 2572 flags.go:64] FLAG: --cloud-provider="external" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478247 2572 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478251 2572 flags.go:64] FLAG: --cluster-domain="" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478254 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478257 2572 flags.go:64] FLAG: --config-dir="" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478260 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478263 2572 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478267 2572 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478270 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478273 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478277 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478280 2572 flags.go:64] FLAG: --contention-profiling="false" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478283 2572 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478286 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478289 2572 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478292 2572 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478296 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478299 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 14:55:57.481357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478302 2572 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478305 2572 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478308 2572 flags.go:64] FLAG: --enable-server="true" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478311 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478316 2572 flags.go:64] FLAG: --event-burst="100" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478319 2572 flags.go:64] FLAG: --event-qps="50" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478322 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478325 2572 flags.go:64] FLAG: 
--event-storage-event-limit="default=0" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478328 2572 flags.go:64] FLAG: --eviction-hard="" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478332 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478335 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478337 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478341 2572 flags.go:64] FLAG: --eviction-soft="" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478344 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478346 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478349 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478352 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478355 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478358 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478361 2572 flags.go:64] FLAG: --feature-gates="" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478365 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478367 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478370 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478374 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478377 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 21 14:55:57.481993 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478380 2572 flags.go:64] FLAG: --help="false" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478383 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478386 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478389 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478392 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478395 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478399 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478402 2572 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478405 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478407 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478411 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478414 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478417 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478420 2572 flags.go:64] FLAG: --kube-reserved="" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478423 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478426 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478429 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478432 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478435 2572 flags.go:64] FLAG: --lock-file="" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478438 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478441 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478444 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478449 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 14:55:57.482594 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478452 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478455 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478458 2572 flags.go:64] FLAG: --logging-format="text" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478461 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478464 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478467 2572 flags.go:64] FLAG: --manifest-url="" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478470 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478474 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478477 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478481 2572 flags.go:64] FLAG: --max-pods="110" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478485 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 
21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478488 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478490 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478493 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478496 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478499 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478502 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478508 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478511 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478514 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478518 2572 flags.go:64] FLAG: --pod-cidr="" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478521 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478526 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478529 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 14:55:57.483207 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478532 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478535 2572 flags.go:64] FLAG: --port="10250" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478539 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478542 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b7d211cdeccd03f4" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478545 2572 flags.go:64] FLAG: --qos-reserved="" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478548 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478551 2572 flags.go:64] FLAG: --register-node="true" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478554 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478557 2572 flags.go:64] FLAG: --register-with-taints="" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478560 2572 flags.go:64] FLAG: --registry-burst="10" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478563 2572 flags.go:64] FLAG: --registry-qps="5" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478565 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478568 
2572 flags.go:64] FLAG: --reserved-memory="" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478572 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478575 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478578 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478581 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478584 2572 flags.go:64] FLAG: --runonce="false" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478587 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478590 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478593 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478596 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478598 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478602 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478605 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478608 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 14:55:57.483863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478610 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478613 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478616 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478619 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478622 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478625 2572 flags.go:64] FLAG: --system-cgroups="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478632 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478637 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478640 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478643 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478647 2572 flags.go:64] FLAG: --tls-min-version="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478653 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478655 2572 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478658 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478661 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478664 2572 flags.go:64] FLAG: --v="2" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478669 2572 flags.go:64] FLAG: --version="false" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478673 2572 flags.go:64] FLAG: --vmodule="" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478677 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.478680 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478762 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478765 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478768 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478771 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:57.484514 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478774 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478777 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478780 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478783 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478786 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478789 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478792 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478794 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478797 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478799 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478802 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478804 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 
14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478807 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478810 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478814 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478817 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478819 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478822 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478824 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478828 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:57.485108 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478831 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478833 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478836 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478838 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478841 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478844 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478846 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478849 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478852 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478854 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478857 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478859 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478862 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478864 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478867 
2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478869 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478872 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478874 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478877 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:57.485607 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478880 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478884 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478887 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478890 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478893 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478895 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478899 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478915 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478918 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478920 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478923 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478926 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478930 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478933 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478935 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478938 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478941 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478945 2572 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478948 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:57.486109 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478951 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478954 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478957 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478960 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478963 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478965 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478968 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478971 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478973 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478976 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478979 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478981 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478983 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478986 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478989 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478991 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478994 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478996 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.478999 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.479002 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:57.486584 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.479006 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: 
W0421 14:55:57.479009 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.479011 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.479014 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.479023 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.484832 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.484850 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484893 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484898 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484918 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484923 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484926 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484930 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484933 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484936 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484939 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:57.487135 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484942 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484945 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484948 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484951 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484954 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484956 2572 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484959 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484961 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484964 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484967 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484969 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484972 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484974 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484977 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484980 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484982 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484985 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484987 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484990 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:57.487529 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484992 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484994 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.484997 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485002 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485004 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485007 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485009 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485013 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
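The long runs of feature_gate.go:328 warnings above are cluster-level OpenShift feature gates that the upstream kubelet's gate registry does not recognize, and the same list is re-validated more than once during startup, so the raw warning count overstates how many distinct gates are involved. A minimal Python sketch for summarizing them from a saved journal excerpt; the input path kubelet.log is an assumption, not something taken from this log:

# Minimal sketch: count distinct "unrecognized feature gate" warnings in a
# saved journal excerpt. The path "kubelet.log" is an assumed file name.
import re
from collections import Counter

PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

def summarize(path="kubelet.log"):
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            # findall copes with several entries run together on one line
            counts.update(PATTERN.findall(line))
    return counts

if __name__ == "__main__":
    gates = summarize()
    print(f"{len(gates)} distinct unrecognized gates; most repeated:")
    for name, n in gates.most_common(10):
        print(f"{n:3d}  {name}")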
Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485016 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485019 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485022 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485024 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485027 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485029 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485032 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485034 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485037 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485039 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485041 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485044 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:57.488019 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485046 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485049 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485051 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485054 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485056 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485059 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485062 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485065 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485067 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485070 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485072 2572 feature_gate.go:328] unrecognized feature 
gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485074 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485077 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485079 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485082 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485085 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485087 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485090 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485093 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485096 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:57.488505 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485098 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485101 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485103 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485106 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485110 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
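Each validation pass ends with an informational feature_gate.go:384 entry of the form "feature gates: {map[Name:bool ...]}" recording the gate values actually applied (one such entry appears above and two more follow below). A small sketch, assuming only the space-separated Name:value layout visible in those entries, for turning that text into a dictionary:

# Minimal sketch: parse the effective gate map from an entry like
#   feature gates: {map[DynamicResourceAllocation:false ... VolumeAttributesClass:false]}
# Assumes the space-separated Name:value form seen in these log lines.
import re

def parse_feature_gates(entry):
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", entry)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")
    return gates

sample = ("feature gates: {map[DynamicResourceAllocation:false ImageVolume:true "
          "KMSv1:true ServiceAccountTokenNodeBinding:true VolumeAttributesClass:false]}")
print(parse_feature_gates(sample))
# {'DynamicResourceAllocation': False, 'ImageVolume': True, 'KMSv1': True, ...}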
Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485115 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485118 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485121 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485124 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485127 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485129 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485132 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485134 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485137 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485139 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485141 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485144 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:57.489012 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485146 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.485151 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485239 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485244 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485247 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485250 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485253 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485256 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 
21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485258 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485261 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485264 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485266 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485270 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485272 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485275 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485277 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 14:55:57.489424 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485280 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485282 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485285 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485287 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485290 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485292 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485294 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485297 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485299 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485302 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485305 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485307 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485310 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485312 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 
14:55:57.485315 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485317 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485319 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485322 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485325 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485327 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 14:55:57.489851 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485330 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485332 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485334 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485337 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485339 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485342 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485345 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485347 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485350 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485353 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485356 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485358 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485360 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485363 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485365 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485368 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485370 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 14:55:57.490367 
ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485374 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485378 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 14:55:57.490367 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485381 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485383 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485386 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485389 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485392 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485394 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485396 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485399 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485402 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485404 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485407 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485409 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485412 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485414 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485418 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
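The leading I/W/E character of each klog prefix (I0421, W0421, E0421) encodes the entry's severity, which is the quickest way to separate this warning noise from the informational startup entries and from the errors that appear later in the log. A hedged sketch that counts entries per severity; the sample lines are copied from this log and the regex reflects only the prefix format seen here:

# Minimal sketch: bucket klog entries by severity using the I/W/E prefix,
# e.g. "W0421 14:55:57.485418 ..." is a warning, "E0421 ..." an error.
import re
from collections import Counter

KLOG_PREFIX = re.compile(r"\b([IWE])\d{4} \d{2}:\d{2}:\d{2}\.\d+")
LEVELS = {"I": "info", "W": "warning", "E": "error"}

def severity_counts(lines):
    counts = Counter()
    for line in lines:
        for level in KLOG_PREFIX.findall(line):
            counts[LEVELS[level]] += 1
    return counts

sample = [
    'W0421 14:55:57.485418 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true.',
    'E0421 14:55:57.572309 2572 reflector.go:200] "Failed to watch"',
]
print(severity_counts(sample))  # Counter({'warning': 1, 'error': 1})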
Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485421 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485424 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485426 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485429 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 14:55:57.490836 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485431 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485434 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485436 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485439 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485442 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485444 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485447 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485449 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485452 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485454 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485457 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485459 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485461 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:57.485464 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.485469 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 14:55:57.491303 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.486177 2572 server.go:962] "Client rotation is on, 
will bootstrap in background" Apr 21 14:55:57.491670 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.487999 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 14:55:57.491670 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.489091 2572 server.go:1019] "Starting client certificate rotation" Apr 21 14:55:57.491670 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.489182 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:55:57.491670 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.489212 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 14:55:57.514199 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.514182 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:55:57.518451 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.518429 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 14:55:57.535918 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.535888 2572 log.go:25] "Validated CRI v1 runtime API" Apr 21 14:55:57.541843 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.541826 2572 log.go:25] "Validated CRI v1 image API" Apr 21 14:55:57.543032 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.543012 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 14:55:57.546494 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.546471 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9fd61650-60b7-44d1-8f3a-e70c18f4a503:/dev/nvme0n1p4 f410e476-ba96-4c93-9e20-bc34088bd4df:/dev/nvme0n1p3] Apr 21 14:55:57.546560 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.546493 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 14:55:57.547465 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.547449 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 14:55:57.552358 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.552260 2572 manager.go:217] Machine: {Timestamp:2026-04-21 14:55:57.55037045 +0000 UTC m=+0.404245903 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094388 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23258bc9cfa68d1826525652aa253a SystemUUID:ec23258b-c9cf-a68d-1826-525652aa253a BootID:e7d52c66-1365-46ce-b399-bb0c420cb2da Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 
DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4d:28:e4:e9:cb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4d:28:e4:e9:cb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:75:21:77:c3:03 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 14:55:57.552358 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.552355 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
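The Machine entry above reports MemoryCapacity of 33164492800 bytes, and the container-manager configuration logged a few entries below records SystemReserved memory of 1Gi, KubeReserved of null, and a memory.available hard-eviction threshold of 100Mi. Node allocatable memory follows the standard kubelet accounting of capacity minus the reserved amounts minus the hard-eviction threshold; a rough, purely illustrative sketch of that arithmetic with the values from this log:

# Rough sketch of the kubelet's allocatable-memory accounting:
#   Allocatable = Capacity - SystemReserved - KubeReserved - HardEvictionThreshold
# Values are taken from the Machine and container-manager entries in this log.
GI = 1024 ** 3
MI = 1024 ** 2

capacity = 33_164_492_800          # MemoryCapacity from the Machine entry
system_reserved = 1 * GI           # SystemReserved memory: "1Gi"
kube_reserved = 0                  # KubeReserved is logged as null
eviction_hard = 100 * MI           # memory.available hard threshold: "100Mi"

allocatable = capacity - system_reserved - kube_reserved - eviction_hard
print(f"allocatable memory ~ {allocatable} bytes ({allocatable / GI:.2f} Gi)")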
Apr 21 14:55:57.552460 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.552429 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 14:55:57.554470 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.554445 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 14:55:57.554592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.554473 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-40.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 14:55:57.554631 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.554600 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 14:55:57.554631 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.554608 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 14:55:57.554631 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.554621 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 14:55:57.555873 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.555862 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 14:55:57.557572 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.557562 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 21 14:55:57.557684 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.557674 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 14:55:57.560217 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.560207 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 21 14:55:57.560801 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.560792 2572 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 21 14:55:57.560840 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.560813 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 14:55:57.560840 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.560823 2572 kubelet.go:397] "Adding apiserver pod source" Apr 21 14:55:57.560840 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.560831 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 14:55:57.561944 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.561928 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 14:55:57.561995 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.561959 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 14:55:57.565190 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.565172 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 14:55:57.566462 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.566445 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 14:55:57.568280 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568264 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568285 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568294 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568303 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568311 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568328 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568338 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568346 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 14:55:57.568359 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568358 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 14:55:57.568612 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568367 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 14:55:57.568612 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568380 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 14:55:57.568612 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.568392 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 14:55:57.569290 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.569279 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 14:55:57.569347 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.569293 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 21 14:55:57.572334 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.572309 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-40.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 14:55:57.572449 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.572349 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 14:55:57.572580 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.572564 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 14:55:57.572647 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.572605 2572 server.go:1295] "Started kubelet" Apr 21 14:55:57.573593 ip-10-0-134-40 systemd[1]: Started Kubernetes Kubelet. Apr 21 14:55:57.573731 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.573599 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 14:55:57.574007 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.573946 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 14:55:57.574075 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.574033 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 14:55:57.574421 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.574407 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-40.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 14:55:57.574989 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.574974 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n2n44" Apr 21 14:55:57.575304 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.575281 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 14:55:57.576846 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.576829 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 21 14:55:57.581073 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.581057 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 14:55:57.581167 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.581062 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 14:55:57.581832 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.581814 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 14:55:57.581932 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.581835 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 14:55:57.581997 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.581972 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 14:55:57.582107 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.582088 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not 
found" Apr 21 14:55:57.582194 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582143 2572 factory.go:55] Registering systemd factory Apr 21 14:55:57.582194 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582161 2572 factory.go:223] Registration of the systemd container factory successfully Apr 21 14:55:57.582370 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582350 2572 factory.go:153] Registering CRI-O factory Apr 21 14:55:57.582370 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582366 2572 factory.go:223] Registration of the crio container factory successfully Apr 21 14:55:57.582476 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582418 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 14:55:57.582476 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582440 2572 factory.go:103] Registering Raw factory Apr 21 14:55:57.582476 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582456 2572 manager.go:1196] Started watching for new ooms in manager Apr 21 14:55:57.582845 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.582830 2572 manager.go:319] Starting recovery of all containers Apr 21 14:55:57.583067 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.583055 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 21 14:55:57.583153 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.583144 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 21 14:55:57.583303 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.581993 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-40.ec2.internal.18a8670cc6f07ae4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-40.ec2.internal,UID:ip-10-0-134-40.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-40.ec2.internal,},FirstTimestamp:2026-04-21 14:55:57.57257802 +0000 UTC m=+0.426453475,LastTimestamp:2026-04-21 14:55:57.57257802 +0000 UTC m=+0.426453475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-40.ec2.internal,}" Apr 21 14:55:57.587767 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.587732 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n2n44" Apr 21 14:55:57.592355 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.592341 2572 manager.go:324] Recovery completed Apr 21 14:55:57.593561 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.593503 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 21 14:55:57.594468 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.594439 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:57.596357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.596341 2572 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Apr 21 14:55:57.598644 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.598631 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:57.598705 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.598657 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:57.598705 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.598667 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:57.599844 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.599824 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 14:55:57.599844 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.599844 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 14:55:57.599975 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.599863 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 21 14:55:57.603062 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.603048 2572 policy_none.go:49] "None policy: Start" Apr 21 14:55:57.603121 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.603066 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 14:55:57.603121 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.603076 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 21 14:55:57.603974 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.603958 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-40.ec2.internal\" not found" node="ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.634513 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634495 2572 manager.go:341] "Starting Device Plugin manager" Apr 21 14:55:57.634597 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.634564 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 14:55:57.634597 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634578 2572 server.go:85] "Starting device plugin registration server" Apr 21 14:55:57.634777 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634766 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 14:55:57.634832 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634778 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 14:55:57.634888 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634875 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 14:55:57.634975 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634962 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 14:55:57.635026 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.634976 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 14:55:57.635409 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.635391 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 14:55:57.635463 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.635436 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:57.715286 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.715262 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 14:55:57.717318 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.716466 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 14:55:57.717318 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.716494 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 14:55:57.717318 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.716509 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 14:55:57.717318 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.716515 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 14:55:57.717318 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.716541 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 14:55:57.720810 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.720770 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:57.735399 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.735385 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:57.736371 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.736358 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:57.736429 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.736383 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:57.736429 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.736394 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:57.736429 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.736412 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.754013 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.753998 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.754068 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.754015 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-40.ec2.internal\": node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:57.806457 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.806435 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:57.817519 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.817501 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal"] Apr 21 14:55:57.817588 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.817576 2572 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:57.818311 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.818298 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:57.818392 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.818326 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:57.818392 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.818341 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:57.820470 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.820455 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:57.820638 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.820624 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.820684 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.820651 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:57.821146 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.821132 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:57.821218 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.821145 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:57.821218 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.821156 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:57.821218 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.821168 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:57.821399 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.821168 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:57.821399 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.821262 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:57.823388 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.823375 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.823442 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.823398 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 14:55:57.824103 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.824088 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientMemory" Apr 21 14:55:57.824161 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.824118 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 14:55:57.824161 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.824131 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeHasSufficientPID" Apr 21 14:55:57.842394 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.842376 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-40.ec2.internal\" not found" node="ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.846665 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.846651 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-40.ec2.internal\" not found" node="ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.884600 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.884582 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/826656c716bc3cb9213ccda5f90267ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal\" (UID: \"826656c716bc3cb9213ccda5f90267ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.884659 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.884604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/826656c716bc3cb9213ccda5f90267ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal\" (UID: \"826656c716bc3cb9213ccda5f90267ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.884659 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.884634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9d94cd17bd83c799f493062012f2d96c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-40.ec2.internal\" (UID: \"9d94cd17bd83c799f493062012f2d96c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.906679 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:57.906664 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:57.985782 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.985734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/826656c716bc3cb9213ccda5f90267ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal\" (UID: \"826656c716bc3cb9213ccda5f90267ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.985782 ip-10-0-134-40 
kubenswrapper[2572]: I0421 14:55:57.985763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/826656c716bc3cb9213ccda5f90267ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal\" (UID: \"826656c716bc3cb9213ccda5f90267ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.985941 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.985820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/826656c716bc3cb9213ccda5f90267ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal\" (UID: \"826656c716bc3cb9213ccda5f90267ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.985941 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.985852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9d94cd17bd83c799f493062012f2d96c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-40.ec2.internal\" (UID: \"9d94cd17bd83c799f493062012f2d96c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.985941 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.985900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9d94cd17bd83c799f493062012f2d96c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-40.ec2.internal\" (UID: \"9d94cd17bd83c799f493062012f2d96c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" Apr 21 14:55:57.985941 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:57.985927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/826656c716bc3cb9213ccda5f90267ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal\" (UID: \"826656c716bc3cb9213ccda5f90267ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:58.007159 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.007140 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:58.108222 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.108200 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:58.144388 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.144368 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:58.149235 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.149213 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" Apr 21 14:55:58.209106 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.209082 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:58.309579 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.309526 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:58.410075 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.410054 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-40.ec2.internal\" not found" Apr 21 14:55:58.427049 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.427029 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:58.481373 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.481348 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:58.488936 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.488917 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 14:55:58.489055 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.489033 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 14:55:58.489117 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.489084 2572 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://af8162a87a2ac4e8c9d11777733a0cc3-dfad512a69b7234f.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.134.40:55432->52.6.189.234:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" Apr 21 14:55:58.489117 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.489106 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" Apr 21 14:55:58.489117 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.489100 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 14:55:58.489236 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.489106 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 14:55:58.506054 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.506033 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 14:55:58.561630 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.561571 2572 apiserver.go:52] "Watching apiserver" Apr 21 14:55:58.573087 ip-10-0-134-40 kubenswrapper[2572]: I0421 
14:55:58.573057 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 14:55:58.574021 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.573998 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-52rdt","openshift-image-registry/node-ca-csr2d","openshift-multus/multus-tt8pf","openshift-network-diagnostics/network-check-target-576cx","openshift-network-operator/iptables-alerter-pszc9","openshift-ovn-kubernetes/ovnkube-node-rgqmx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74","openshift-cluster-node-tuning-operator/tuned-qb7rv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal","openshift-multus/multus-additional-cni-plugins-27qzj","openshift-multus/network-metrics-daemon-gzsbc","kube-system/konnectivity-agent-zfc5h","kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal"] Apr 21 14:55:58.577064 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.577047 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.579063 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.579043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.579875 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.579852 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.580033 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.579889 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.580166 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.580148 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 14:55:58.580228 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.580210 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gw772\"" Apr 21 14:55:58.581179 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.581165 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.581383 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.581353 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 14:55:58.581664 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.581643 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.581748 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.581647 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 14:55:58.581748 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.581654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zsff7\"" Apr 21 14:55:58.581850 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.581749 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.583304 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.583284 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:55:58.583402 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.583357 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:55:58.583402 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.583398 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.583612 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.583596 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 14:55:58.583853 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.583829 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.583946 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.583884 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-79rjb\"" Apr 21 14:55:58.584050 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.584032 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 14:55:58.585408 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.585391 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.587405 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.587388 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.587865 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.587847 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-shmt9\"" Apr 21 14:55:58.587961 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.587848 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.587961 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.587934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.589460 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589440 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-cnibin\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-socket-dir-parent\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589484 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-registration-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7p2b\" (UniqueName: \"kubernetes.io/projected/bfd7a190-efd2-4b62-9acb-5f68c16053f5-kube-api-access-m7p2b\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-k8s-cni-cncf-io\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.589577 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-device-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-kubelet\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4021a281-cd8e-4558-8e54-8b6deaf37af9-cni-binary-copy\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-netns\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-hostroot\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfd7a190-efd2-4b62-9acb-5f68c16053f5-host\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfd7a190-efd2-4b62-9acb-5f68c16053f5-serviceca\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589779 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 
14:55:58.589793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-cni-bin\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-etc-kubernetes\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqft9\" (UniqueName: \"kubernetes.io/projected/4021a281-cd8e-4558-8e54-8b6deaf37af9-kube-api-access-rqft9\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-etc-selinux\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589882 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 14:50:57 +0000 UTC" deadline="2028-01-23 20:03:32.117568678 +0000 UTC" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589926 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-system-cni-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589939 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15413h7m33.52766877s" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.589970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-cni-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-cni-multus\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590030 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-conf-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590076 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-584db\"" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-os-release\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590105 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-multus-certs\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-sys-fs\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzfdj\" (UniqueName: \"kubernetes.io/projected/fac361fb-e660-4792-b551-bbab1f86f876-kube-api-access-fzfdj\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-daemon-config\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-socket-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.590654 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.590445 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.591695 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.591681 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:58.591758 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.591737 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:55:58.591964 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.591946 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rn6jn\"" Apr 21 14:55:58.591964 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.591958 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 14:55:58.592086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.591962 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 14:55:58.592086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.592062 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 14:55:58.593917 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.593887 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.596770 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.596659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.596770 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.596660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n5rp8\"" Apr 21 14:55:58.596954 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.596813 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 14:55:58.596954 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.596831 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 14:55:58.599429 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.599411 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 14:55:58.599529 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.599431 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.599529 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.599458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2znd6\"" Apr 21 14:55:58.599629 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.599557 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.600099 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.600082 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.602636 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602613 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 14:55:58.602858 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602806 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 14:55:58.602858 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602828 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 14:55:58.602858 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602842 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 14:55:58.603042 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602863 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 14:55:58.603042 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602811 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vqbxz\"" Apr 21 14:55:58.603042 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.602834 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 14:55:58.612675 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.612643 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hjjjj" Apr 21 14:55:58.620009 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.619992 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hjjjj" Apr 21 14:55:58.682666 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.682639 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 14:55:58.690745 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db6c90a6-c365-45f5-bad7-00c882e79192-ovn-node-metrics-cert\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.690856 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-sys\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.690856 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-var-lib-kubelet\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.690856 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690801 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f23d41da-513b-45f3-a198-6696c53a7568-tmp\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.690856 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cnibin\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqft9\" (UniqueName: \"kubernetes.io/projected/4021a281-cd8e-4558-8e54-8b6deaf37af9-kube-api-access-rqft9\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7px\" (UniqueName: \"kubernetes.io/projected/a8dc8faa-055a-43d4-9162-2b25481ba9c8-kube-api-access-zz7px\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-system-cni-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-cni-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.690989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-conf-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-system-cni-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-ovn\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.691086 ip-10-0-134-40 
kubenswrapper[2572]: I0421 14:55:58.691077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31f48be8-c9fa-4f17-9944-60b8aaace332-tmp-dir\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.691086 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-conf-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-cni-dir\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-daemon-config\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-socket-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-env-overrides\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-kubernetes\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-run\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-host\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7blv4\" (UniqueName: \"kubernetes.io/projected/f23d41da-513b-45f3-a198-6696c53a7568-kube-api-access-7blv4\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-socket-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-cnibin\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-socket-dir-parent\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-cnibin\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-registration-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-socket-dir-parent\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-cni-netd\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.691459 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-registration-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvc5\" (UniqueName: \"kubernetes.io/projected/db6c90a6-c365-45f5-bad7-00c882e79192-kube-api-access-wrvc5\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-systemd\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-k8s-cni-cncf-io\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-kubelet\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-k8s-cni-cncf-io\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-systemd-units\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysctl-d\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-run-ovn-kubernetes\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-ovnkube-config\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691621 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31f48be8-c9fa-4f17-9944-60b8aaace332-hosts-file\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnc29\" (UniqueName: \"kubernetes.io/projected/31f48be8-c9fa-4f17-9944-60b8aaace332-kube-api-access-lnc29\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691650 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-lib-modules\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-os-release\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/261649a7-d8b4-480d-b755-b9ece39df52e-iptables-alerter-script\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.692276 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691719 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-kubelet\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-kubelet\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4021a281-cd8e-4558-8e54-8b6deaf37af9-cni-binary-copy\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-netns\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-netns\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfd7a190-efd2-4b62-9acb-5f68c16053f5-host\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysctl-conf\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfd7a190-efd2-4b62-9acb-5f68c16053f5-host\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-cni-bin\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-etc-kubernetes\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-etc-kubernetes\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-etc-selinux\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-slash\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692013 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-etc-selinux\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.691961 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-cni-bin\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-cni-bin\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-ovnkube-script-lib\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.692768 ip-10-0-134-40 
kubenswrapper[2572]: I0421 14:55:58.692126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-cni-multus\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-modprobe-d\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b27340dd-c76e-48e4-a58b-6826530d3e1d-agent-certs\") pod \"konnectivity-agent-zfc5h\" (UID: \"b27340dd-c76e-48e4-a58b-6826530d3e1d\") " pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-var-lib-cni-multus\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hvr\" (UniqueName: \"kubernetes.io/projected/ac89606d-af67-40a9-8819-d321ad5b6b55-kube-api-access-f5hvr\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4021a281-cd8e-4558-8e54-8b6deaf37af9-cni-binary-copy\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-os-release\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-multus-certs\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-node-log\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b27340dd-c76e-48e4-a58b-6826530d3e1d-konnectivity-ca\") pod \"konnectivity-agent-zfc5h\" (UID: \"b27340dd-c76e-48e4-a58b-6826530d3e1d\") " pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-os-release\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-host-run-multus-certs\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-sys-fs\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4021a281-cd8e-4558-8e54-8b6deaf37af9-multus-daemon-config\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.693437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692508 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzfdj\" (UniqueName: \"kubernetes.io/projected/fac361fb-e660-4792-b551-bbab1f86f876-kube-api-access-fzfdj\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-sys-fs\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.692549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-systemd\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-etc-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-log-socket\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-system-cni-dir\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7p2b\" (UniqueName: \"kubernetes.io/projected/bfd7a190-efd2-4b62-9acb-5f68c16053f5-kube-api-access-m7p2b\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f23d41da-513b-45f3-a198-6696c53a7568-etc-tuned\") pod 
\"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-device-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-var-lib-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysconfig\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-device-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac361fb-e660-4792-b551-bbab1f86f876-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:58.693974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/261649a7-d8b4-480d-b755-b9ece39df52e-host-slash\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.694483 
ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2zx\" (UniqueName: \"kubernetes.io/projected/261649a7-d8b4-480d-b755-b9ece39df52e-kube-api-access-jv2zx\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.694483 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-hostroot\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.694483 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfd7a190-efd2-4b62-9acb-5f68c16053f5-serviceca\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.694483 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4021a281-cd8e-4558-8e54-8b6deaf37af9-hostroot\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.694483 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-run-netns\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.694483 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.693704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bfd7a190-efd2-4b62-9acb-5f68c16053f5-serviceca\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.702233 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.702213 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 14:55:58.703566 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.703543 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:58.703670 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.703571 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:58.703670 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.703584 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:58.703670 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.703660 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:59.203628776 +0000 UTC m=+2.057504220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:58.705335 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.705310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7p2b\" (UniqueName: \"kubernetes.io/projected/bfd7a190-efd2-4b62-9acb-5f68c16053f5-kube-api-access-m7p2b\") pod \"node-ca-csr2d\" (UID: \"bfd7a190-efd2-4b62-9acb-5f68c16053f5\") " pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.705431 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.705314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqft9\" (UniqueName: \"kubernetes.io/projected/4021a281-cd8e-4558-8e54-8b6deaf37af9-kube-api-access-rqft9\") pod \"multus-tt8pf\" (UID: \"4021a281-cd8e-4558-8e54-8b6deaf37af9\") " pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.706667 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.706649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzfdj\" (UniqueName: \"kubernetes.io/projected/fac361fb-e660-4792-b551-bbab1f86f876-kube-api-access-fzfdj\") pod \"aws-ebs-csi-driver-node-9ks74\" (UID: \"fac361fb-e660-4792-b551-bbab1f86f876\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.794346 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-run-netns\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.794346 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794348 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db6c90a6-c365-45f5-bad7-00c882e79192-ovn-node-metrics-cert\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-sys\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-var-lib-kubelet\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-var-lib-kubelet\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-run-netns\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-sys\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f23d41da-513b-45f3-a198-6696c53a7568-tmp\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.794540 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cnibin\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7px\" (UniqueName: \"kubernetes.io/projected/a8dc8faa-055a-43d4-9162-2b25481ba9c8-kube-api-access-zz7px\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.794831 ip-10-0-134-40 
kubenswrapper[2572]: I0421 14:55:58.794598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-ovn\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31f48be8-c9fa-4f17-9944-60b8aaace332-tmp-dir\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cnibin\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-ovn\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-env-overrides\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-kubernetes\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.794831 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-run\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-kubernetes\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-host\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794890 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31f48be8-c9fa-4f17-9944-60b8aaace332-tmp-dir\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-run\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7blv4\" (UniqueName: \"kubernetes.io/projected/f23d41da-513b-45f3-a198-6696c53a7568-kube-api-access-7blv4\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-host\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.794974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-cni-netd\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvc5\" (UniqueName: \"kubernetes.io/projected/db6c90a6-c365-45f5-bad7-00c882e79192-kube-api-access-wrvc5\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-cni-netd\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-systemd\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-kubelet\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-systemd-units\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795180 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-systemd\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysctl-d\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795201 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-kubelet\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-run-ovn-kubernetes\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-systemd-units\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-env-overrides\") pod \"ovnkube-node-rgqmx\" (UID: 
\"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysctl-d\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-ovnkube-config\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-run-ovn-kubernetes\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31f48be8-c9fa-4f17-9944-60b8aaace332-hosts-file\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnc29\" (UniqueName: \"kubernetes.io/projected/31f48be8-c9fa-4f17-9944-60b8aaace332-kube-api-access-lnc29\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-lib-modules\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-os-release\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/31f48be8-c9fa-4f17-9944-60b8aaace332-hosts-file\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-lib-modules\") pod 
\"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/261649a7-d8b4-480d-b755-b9ece39df52e-iptables-alerter-script\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-os-release\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795810 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysctl-conf\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-slash\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.795970 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-ovnkube-config\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-cni-bin\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysctl-conf\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795956 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-slash\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-ovnkube-script-lib\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-host-cni-bin\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.795967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-modprobe-d\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b27340dd-c76e-48e4-a58b-6826530d3e1d-agent-certs\") pod \"konnectivity-agent-zfc5h\" (UID: \"b27340dd-c76e-48e4-a58b-6826530d3e1d\") " pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-modprobe-d\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 
14:55:58.796142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hvr\" (UniqueName: \"kubernetes.io/projected/ac89606d-af67-40a9-8819-d321ad5b6b55-kube-api-access-f5hvr\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-node-log\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b27340dd-c76e-48e4-a58b-6826530d3e1d-konnectivity-ca\") pod \"konnectivity-agent-zfc5h\" (UID: \"b27340dd-c76e-48e4-a58b-6826530d3e1d\") " pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.796763 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-systemd\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-etc-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-log-socket\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/261649a7-d8b4-480d-b755-b9ece39df52e-iptables-alerter-script\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796362 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-system-cni-dir\") pod \"multus-additional-cni-plugins-27qzj\" (UID: 
\"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f23d41da-513b-45f3-a198-6696c53a7568-etc-tuned\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-var-lib-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysconfig\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db6c90a6-c365-45f5-bad7-00c882e79192-ovnkube-script-lib\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-node-log\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-run-systemd\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-etc-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-log-socket\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db6c90a6-c365-45f5-bad7-00c882e79192-var-lib-openvswitch\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f23d41da-513b-45f3-a198-6696c53a7568-etc-sysconfig\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.797592 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8dc8faa-055a-43d4-9162-2b25481ba9c8-system-cni-dir\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.796972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f23d41da-513b-45f3-a198-6696c53a7568-tmp\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797038 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b27340dd-c76e-48e4-a58b-6826530d3e1d-konnectivity-ca\") pod \"konnectivity-agent-zfc5h\" (UID: \"b27340dd-c76e-48e4-a58b-6826530d3e1d\") " pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.797081 2572 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/261649a7-d8b4-480d-b755-b9ece39df52e-host-slash\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/261649a7-d8b4-480d-b755-b9ece39df52e-host-slash\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2zx\" (UniqueName: \"kubernetes.io/projected/261649a7-d8b4-480d-b755-b9ece39df52e-kube-api-access-jv2zx\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797243 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:58.797499 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:55:59.297482012 +0000 UTC m=+2.151357452 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:58.798212 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.797457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db6c90a6-c365-45f5-bad7-00c882e79192-ovn-node-metrics-cert\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.798629 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.798455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8dc8faa-055a-43d4-9162-2b25481ba9c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.798629 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.798514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b27340dd-c76e-48e4-a58b-6826530d3e1d-agent-certs\") pod \"konnectivity-agent-zfc5h\" (UID: \"b27340dd-c76e-48e4-a58b-6826530d3e1d\") " pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:58.798716 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.798701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f23d41da-513b-45f3-a198-6696c53a7568-etc-tuned\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.806118 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.806095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvc5\" (UniqueName: \"kubernetes.io/projected/db6c90a6-c365-45f5-bad7-00c882e79192-kube-api-access-wrvc5\") pod \"ovnkube-node-rgqmx\" (UID: \"db6c90a6-c365-45f5-bad7-00c882e79192\") " pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:58.807553 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.807532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7blv4\" (UniqueName: \"kubernetes.io/projected/f23d41da-513b-45f3-a198-6696c53a7568-kube-api-access-7blv4\") pod \"tuned-qb7rv\" (UID: \"f23d41da-513b-45f3-a198-6696c53a7568\") " pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.808153 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.808133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnc29\" (UniqueName: \"kubernetes.io/projected/31f48be8-c9fa-4f17-9944-60b8aaace332-kube-api-access-lnc29\") pod \"node-resolver-52rdt\" (UID: \"31f48be8-c9fa-4f17-9944-60b8aaace332\") " pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.808153 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.808151 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7px\" (UniqueName: \"kubernetes.io/projected/a8dc8faa-055a-43d4-9162-2b25481ba9c8-kube-api-access-zz7px\") pod \"multus-additional-cni-plugins-27qzj\" (UID: \"a8dc8faa-055a-43d4-9162-2b25481ba9c8\") " pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 
14:55:58.809967 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.809945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2zx\" (UniqueName: \"kubernetes.io/projected/261649a7-d8b4-480d-b755-b9ece39df52e-kube-api-access-jv2zx\") pod \"iptables-alerter-pszc9\" (UID: \"261649a7-d8b4-480d-b755-b9ece39df52e\") " pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:58.810109 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.810093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hvr\" (UniqueName: \"kubernetes.io/projected/ac89606d-af67-40a9-8819-d321ad5b6b55-kube-api-access-f5hvr\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:58.822630 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.822602 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d94cd17bd83c799f493062012f2d96c.slice/crio-e468bddfc3af32451023256a637747adce9d229cc3a9303fc497e57a21c3ea5e WatchSource:0}: Error finding container e468bddfc3af32451023256a637747adce9d229cc3a9303fc497e57a21c3ea5e: Status 404 returned error can't find the container with id e468bddfc3af32451023256a637747adce9d229cc3a9303fc497e57a21c3ea5e Apr 21 14:55:58.823211 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.823192 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod826656c716bc3cb9213ccda5f90267ef.slice/crio-d4e090e15e9498de7a6c93d843e003798f457c3a67c1627438a190689ed4b9eb WatchSource:0}: Error finding container d4e090e15e9498de7a6c93d843e003798f457c3a67c1627438a190689ed4b9eb: Status 404 returned error can't find the container with id d4e090e15e9498de7a6c93d843e003798f457c3a67c1627438a190689ed4b9eb Apr 21 14:55:58.826604 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.826587 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 14:55:58.922518 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.922499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" Apr 21 14:55:58.927809 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.927646 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-csr2d" Apr 21 14:55:58.927977 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.927957 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac361fb_e660_4792_b551_bbab1f86f876.slice/crio-bfec4a2a68f4b9c7ce13ba8d9508baf5eb331098c4317f9a0edf1c59eaebff84 WatchSource:0}: Error finding container bfec4a2a68f4b9c7ce13ba8d9508baf5eb331098c4317f9a0edf1c59eaebff84: Status 404 returned error can't find the container with id bfec4a2a68f4b9c7ce13ba8d9508baf5eb331098c4317f9a0edf1c59eaebff84 Apr 21 14:55:58.933576 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.933555 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd7a190_efd2_4b62_9acb_5f68c16053f5.slice/crio-a1e951a4caac3456c5a3038212b38a4c0439628a44718c6902d7547391e6944e WatchSource:0}: Error finding container a1e951a4caac3456c5a3038212b38a4c0439628a44718c6902d7547391e6944e: Status 404 returned error can't find the container with id a1e951a4caac3456c5a3038212b38a4c0439628a44718c6902d7547391e6944e Apr 21 14:55:58.939221 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.939203 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tt8pf" Apr 21 14:55:58.944646 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.944625 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4021a281_cd8e_4558_8e54_8b6deaf37af9.slice/crio-531fbb1d8d1649f4f48cfa072c9bcb9d1671325f7a29725ba166e482fe6028e9 WatchSource:0}: Error finding container 531fbb1d8d1649f4f48cfa072c9bcb9d1671325f7a29725ba166e482fe6028e9: Status 404 returned error can't find the container with id 531fbb1d8d1649f4f48cfa072c9bcb9d1671325f7a29725ba166e482fe6028e9 Apr 21 14:55:58.958545 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.958525 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-52rdt" Apr 21 14:55:58.963724 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.963700 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f48be8_c9fa_4f17_9944_60b8aaace332.slice/crio-6a9b3273a2608b1d161cc07b89f9af31ad5cff384755e4b9b53ddf6bbde0a9fa WatchSource:0}: Error finding container 6a9b3273a2608b1d161cc07b89f9af31ad5cff384755e4b9b53ddf6bbde0a9fa: Status 404 returned error can't find the container with id 6a9b3273a2608b1d161cc07b89f9af31ad5cff384755e4b9b53ddf6bbde0a9fa Apr 21 14:55:58.967779 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.967761 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" Apr 21 14:55:58.972772 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.972746 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23d41da_513b_45f3_a198_6696c53a7568.slice/crio-bd7491603b23ade460b3e0be080d977dcbc96a3f22fdf12079cb3b2373d3b115 WatchSource:0}: Error finding container bd7491603b23ade460b3e0be080d977dcbc96a3f22fdf12079cb3b2373d3b115: Status 404 returned error can't find the container with id bd7491603b23ade460b3e0be080d977dcbc96a3f22fdf12079cb3b2373d3b115 Apr 21 14:55:58.981532 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.981514 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-27qzj" Apr 21 14:55:58.987700 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:58.987680 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8dc8faa_055a_43d4_9162_2b25481ba9c8.slice/crio-52601bc97c1a0e9384d352d59f7fc7f30bac4ebd09f8f8eb4946d6295b9b65c3 WatchSource:0}: Error finding container 52601bc97c1a0e9384d352d59f7fc7f30bac4ebd09f8f8eb4946d6295b9b65c3: Status 404 returned error can't find the container with id 52601bc97c1a0e9384d352d59f7fc7f30bac4ebd09f8f8eb4946d6295b9b65c3 Apr 21 14:55:58.998225 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:58.998210 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:55:59.004789 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:59.004771 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb27340dd_c76e_48e4_a58b_6826530d3e1d.slice/crio-47459c4c8dff14419d068063a99fbff1d5828c3a5468a90e5c02bbcd80ad9a75 WatchSource:0}: Error finding container 47459c4c8dff14419d068063a99fbff1d5828c3a5468a90e5c02bbcd80ad9a75: Status 404 returned error can't find the container with id 47459c4c8dff14419d068063a99fbff1d5828c3a5468a90e5c02bbcd80ad9a75 Apr 21 14:55:59.018163 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.018142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pszc9" Apr 21 14:55:59.022961 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.022944 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:55:59.023549 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:59.023533 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261649a7_d8b4_480d_b755_b9ece39df52e.slice/crio-94441a42e3c29d835e2e068bc6a2352e54a8e9525d86ab945160e63fc6cdc7d1 WatchSource:0}: Error finding container 94441a42e3c29d835e2e068bc6a2352e54a8e9525d86ab945160e63fc6cdc7d1: Status 404 returned error can't find the container with id 94441a42e3c29d835e2e068bc6a2352e54a8e9525d86ab945160e63fc6cdc7d1 Apr 21 14:55:59.028024 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:55:59.028008 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb6c90a6_c365_45f5_bad7_00c882e79192.slice/crio-6128bb749a5c3eefb3ee8fa807875cd6a4af1de8d55ac1f57ae5be194b7ff433 WatchSource:0}: Error finding container 6128bb749a5c3eefb3ee8fa807875cd6a4af1de8d55ac1f57ae5be194b7ff433: Status 404 returned error can't find the container with id 6128bb749a5c3eefb3ee8fa807875cd6a4af1de8d55ac1f57ae5be194b7ff433 Apr 21 14:55:59.029635 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.029616 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:59.300936 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.300873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:55:59.301080 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.300952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:55:59.301144 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.301091 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:55:59.301144 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.301107 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:55:59.301144 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.301117 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:59.301293 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.301156 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:00.301142155 +0000 UTC m=+3.155017595 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:55:59.301293 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.301222 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:59.301293 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.301255 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:00.301244151 +0000 UTC m=+3.155119593 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:55:59.384723 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.384684 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:59.622349 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.622265 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:50:58 +0000 UTC" deadline="2027-10-31 20:09:56.281836862 +0000 UTC" Apr 21 14:55:59.622349 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.622304 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13397h13m56.659536821s" Apr 21 14:55:59.643983 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.643960 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:55:59.719934 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.719346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:55:59.719934 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:55:59.719460 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:55:59.728324 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.728250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" event={"ID":"fac361fb-e660-4792-b551-bbab1f86f876","Type":"ContainerStarted","Data":"bfec4a2a68f4b9c7ce13ba8d9508baf5eb331098c4317f9a0edf1c59eaebff84"} Apr 21 14:55:59.732240 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.732192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" event={"ID":"826656c716bc3cb9213ccda5f90267ef","Type":"ContainerStarted","Data":"d4e090e15e9498de7a6c93d843e003798f457c3a67c1627438a190689ed4b9eb"} Apr 21 14:55:59.756523 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.756493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-52rdt" event={"ID":"31f48be8-c9fa-4f17-9944-60b8aaace332","Type":"ContainerStarted","Data":"6a9b3273a2608b1d161cc07b89f9af31ad5cff384755e4b9b53ddf6bbde0a9fa"} Apr 21 14:55:59.763876 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.763839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-csr2d" event={"ID":"bfd7a190-efd2-4b62-9acb-5f68c16053f5","Type":"ContainerStarted","Data":"a1e951a4caac3456c5a3038212b38a4c0439628a44718c6902d7547391e6944e"} Apr 21 14:55:59.765350 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.765327 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" event={"ID":"9d94cd17bd83c799f493062012f2d96c","Type":"ContainerStarted","Data":"e468bddfc3af32451023256a637747adce9d229cc3a9303fc497e57a21c3ea5e"} Apr 21 14:55:59.775240 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.775215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"6128bb749a5c3eefb3ee8fa807875cd6a4af1de8d55ac1f57ae5be194b7ff433"} Apr 21 14:55:59.782941 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.782919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pszc9" event={"ID":"261649a7-d8b4-480d-b755-b9ece39df52e","Type":"ContainerStarted","Data":"94441a42e3c29d835e2e068bc6a2352e54a8e9525d86ab945160e63fc6cdc7d1"} Apr 21 14:55:59.791623 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.791581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zfc5h" event={"ID":"b27340dd-c76e-48e4-a58b-6826530d3e1d","Type":"ContainerStarted","Data":"47459c4c8dff14419d068063a99fbff1d5828c3a5468a90e5c02bbcd80ad9a75"} Apr 21 14:55:59.797762 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.797736 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerStarted","Data":"52601bc97c1a0e9384d352d59f7fc7f30bac4ebd09f8f8eb4946d6295b9b65c3"} Apr 21 14:55:59.801755 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.801732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" event={"ID":"f23d41da-513b-45f3-a198-6696c53a7568","Type":"ContainerStarted","Data":"bd7491603b23ade460b3e0be080d977dcbc96a3f22fdf12079cb3b2373d3b115"} 
Apr 21 14:55:59.810263 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:55:59.810235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tt8pf" event={"ID":"4021a281-cd8e-4558-8e54-8b6deaf37af9","Type":"ContainerStarted","Data":"531fbb1d8d1649f4f48cfa072c9bcb9d1671325f7a29725ba166e482fe6028e9"} Apr 21 14:56:00.309880 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:00.309845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:00.310065 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:00.309942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:00.310137 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.310074 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:00.310190 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.310159 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:02.310140503 +0000 UTC m=+5.164015948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:00.310594 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.310573 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:00.310594 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.310597 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:00.310748 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.310618 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:00.310748 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.310663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:02.310648333 +0000 UTC m=+5.164523776 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:00.368696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:00.367758 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 14:56:00.624330 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:00.624247 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 14:50:58 +0000 UTC" deadline="2027-12-10 00:22:45.9368835 +0000 UTC" Apr 21 14:56:00.624330 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:00.624284 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14337h26m45.312603885s" Apr 21 14:56:00.717393 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:00.717364 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:00.717562 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:00.717504 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:01.717887 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:01.717851 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:01.718322 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:01.717992 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:02.326619 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:02.326578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:02.326800 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:02.326639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:02.326800 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.326789 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:02.326930 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.326811 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:02.326930 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.326824 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:02.326930 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.326882 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:06.326863976 +0000 UTC m=+9.180739431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:02.327104 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.327022 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:02.327149 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.327112 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:06.327092825 +0000 UTC m=+9.180968280 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:02.717437 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:02.716940 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:02.717437 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:02.717086 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:03.717409 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:03.716953 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:03.717409 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:03.717073 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:04.716720 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:04.716669 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:04.716879 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:04.716817 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:05.717580 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:05.717483 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:05.718030 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:05.717671 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:06.357320 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:06.357275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:06.357518 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:06.357361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:06.357518 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.357438 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:06.357518 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.357509 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:14.357492467 +0000 UTC m=+17.211367911 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:06.357703 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.357537 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:06.357703 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.357561 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:06.357703 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.357576 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:06.357703 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.357661 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:14.357644537 +0000 UTC m=+17.211519980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:06.717758 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:06.717465 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:06.717758 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:06.717599 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:07.717630 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:07.717584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:07.717822 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:07.717732 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:08.716975 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:08.716936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:08.717144 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:08.717099 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:09.717099 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:09.717055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:09.717487 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:09.717175 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:10.717544 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:10.717505 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:10.717991 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:10.717648 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:11.717308 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:11.717256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:11.717505 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:11.717427 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:12.717163 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:12.717124 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:12.717583 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:12.717243 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:13.717357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:13.717318 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:13.717791 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:13.717459 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:14.416576 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:14.416533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:14.416785 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:14.416585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:14.416785 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.416706 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:14.416894 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.416788 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:30.416765428 +0000 UTC m=+33.270640874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:14.416894 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.416714 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:14.416894 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.416820 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:14.416894 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.416833 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:14.416894 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.416889 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:30.416871641 +0000 UTC m=+33.270747081 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:14.717190 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:14.717154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:14.717383 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:14.717290 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:15.716759 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:15.716723 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:15.716953 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:15.716852 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:16.716718 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:16.716685 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:16.717333 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:16.716806 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:17.718586 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.718554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:17.719134 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:17.718665 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:17.849805 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.849594 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"cc350dfdb934cd66575c6a58e08102dce7ac27afc18958a7e48778c6691d1393"} Apr 21 14:56:17.852000 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.851941 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" event={"ID":"f23d41da-513b-45f3-a198-6696c53a7568","Type":"ContainerStarted","Data":"884b86b103daac90cd344b1d2caaed3765e6406004259c0087264d2a306bdb39"} Apr 21 14:56:17.853581 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.853554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tt8pf" event={"ID":"4021a281-cd8e-4558-8e54-8b6deaf37af9","Type":"ContainerStarted","Data":"5d4fe53c70d9bac0b4a40d92eb8c17022f5bc48bfee6e6a1a4339998ab816716"} Apr 21 14:56:17.855451 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.855426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" event={"ID":"9d94cd17bd83c799f493062012f2d96c","Type":"ContainerStarted","Data":"83261ead94d2c63541683e75f6a14d287cba03a186081d5d9f7dddc675569ec6"} Apr 21 14:56:17.869881 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.869815 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qb7rv" podStartSLOduration=2.836557214 podStartE2EDuration="20.869796951s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:58.974448917 +0000 UTC m=+1.828324356" lastFinishedPulling="2026-04-21 14:56:17.007688651 +0000 UTC m=+19.861564093" observedRunningTime="2026-04-21 14:56:17.869404946 +0000 UTC m=+20.723280409" watchObservedRunningTime="2026-04-21 14:56:17.869796951 +0000 UTC m=+20.723672415" Apr 21 14:56:17.888152 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.888091 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tt8pf" podStartSLOduration=2.789655939 podStartE2EDuration="20.888070952s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:58.945962257 +0000 UTC m=+1.799837697" lastFinishedPulling="2026-04-21 14:56:17.044377263 +0000 UTC m=+19.898252710" observedRunningTime="2026-04-21 14:56:17.88729466 +0000 UTC m=+20.741170134" watchObservedRunningTime="2026-04-21 14:56:17.888070952 +0000 UTC m=+20.741946417" Apr 21 14:56:17.902401 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:17.902354 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-40.ec2.internal" podStartSLOduration=19.902339218 podStartE2EDuration="19.902339218s" podCreationTimestamp="2026-04-21 14:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:56:17.902101669 +0000 UTC m=+20.755977132" watchObservedRunningTime="2026-04-21 14:56:17.902339218 +0000 UTC m=+20.756214680" Apr 21 14:56:18.717487 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.717451 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:18.717684 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:18.717634 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:18.858667 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.858629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-csr2d" event={"ID":"bfd7a190-efd2-4b62-9acb-5f68c16053f5","Type":"ContainerStarted","Data":"30f4b4e48be1d14d375f388099609ec359b7118c14691f211405d073f27c68e8"} Apr 21 14:56:18.861764 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.861737 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 14:56:18.862110 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.862087 2572 generic.go:358] "Generic (PLEG): container finished" podID="db6c90a6-c365-45f5-bad7-00c882e79192" containerID="d49f6841e5d9ea29a645a76792a8b587821c988f0eb15ca519576e2a9f1758da" exitCode=1 Apr 21 14:56:18.862230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.862159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"64029e286cc98033b7a95941b79621cee4234749dd2b3f2a26de6503dc40bc35"} Apr 21 14:56:18.862230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.862193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"cf0991f35c5288b5df3925fb37e2a9a84655bfbf83e13b18374071ecd6fd44b3"} Apr 21 14:56:18.862230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.862207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"59bb146cc7e800e2bb2d7e81a32d244c96ea982c30a49cc8df1392fb437ac7ab"} Apr 21 14:56:18.862376 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.862234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"795f5a1c384ee39329ca7721bb16e44e9fe8e935c2a6650ce3dc881a2d0fbdcb"} Apr 21 14:56:18.862376 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.862247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerDied","Data":"d49f6841e5d9ea29a645a76792a8b587821c988f0eb15ca519576e2a9f1758da"} Apr 21 14:56:18.863701 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.863676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pszc9" event={"ID":"261649a7-d8b4-480d-b755-b9ece39df52e","Type":"ContainerStarted","Data":"de106b531a7a938969145af196a04b8487555ecd459c6982b5978075b5f87f86"} Apr 21 14:56:18.865214 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.865177 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kube-system/konnectivity-agent-zfc5h" event={"ID":"b27340dd-c76e-48e4-a58b-6826530d3e1d","Type":"ContainerStarted","Data":"af752f575cc43ea08e8caef32e9673fecbb0737bbb0fa5920396b79745a96129"} Apr 21 14:56:18.866726 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.866698 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8dc8faa-055a-43d4-9162-2b25481ba9c8" containerID="f1d11af1c0c42435816933620f33481916c02d52aac8d36157ee6bf1e92c3da2" exitCode=0 Apr 21 14:56:18.866796 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.866729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerDied","Data":"f1d11af1c0c42435816933620f33481916c02d52aac8d36157ee6bf1e92c3da2"} Apr 21 14:56:18.868518 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.868490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" event={"ID":"fac361fb-e660-4792-b551-bbab1f86f876","Type":"ContainerStarted","Data":"08d3b08d574e7f98892a6332110127f6c1f648e56ea0efa7724f1b454cc97f44"} Apr 21 14:56:18.870119 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.870064 2572 generic.go:358] "Generic (PLEG): container finished" podID="826656c716bc3cb9213ccda5f90267ef" containerID="9f728ec99bcaf9eb4b4208654fc0ba71e9d9c5caac4a97bfee974801466cdccb" exitCode=0 Apr 21 14:56:18.870200 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.870141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" event={"ID":"826656c716bc3cb9213ccda5f90267ef","Type":"ContainerDied","Data":"9f728ec99bcaf9eb4b4208654fc0ba71e9d9c5caac4a97bfee974801466cdccb"} Apr 21 14:56:18.872615 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.872568 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-csr2d" podStartSLOduration=3.854057713 podStartE2EDuration="21.872553672s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:58.937821016 +0000 UTC m=+1.791696457" lastFinishedPulling="2026-04-21 14:56:16.956316976 +0000 UTC m=+19.810192416" observedRunningTime="2026-04-21 14:56:18.872417711 +0000 UTC m=+21.726293189" watchObservedRunningTime="2026-04-21 14:56:18.872553672 +0000 UTC m=+21.726429135" Apr 21 14:56:18.872812 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.872786 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-52rdt" event={"ID":"31f48be8-c9fa-4f17-9944-60b8aaace332","Type":"ContainerStarted","Data":"54f6970a3fc371dab1e26ff28976dc2d456bbaa046e20b55f989894747fcea5d"} Apr 21 14:56:18.923677 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.923619 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zfc5h" podStartSLOduration=3.973332354 podStartE2EDuration="21.923601798s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:59.006162897 +0000 UTC m=+1.860038337" lastFinishedPulling="2026-04-21 14:56:16.956432336 +0000 UTC m=+19.810307781" observedRunningTime="2026-04-21 14:56:18.923029233 +0000 UTC m=+21.776904697" watchObservedRunningTime="2026-04-21 14:56:18.923601798 +0000 UTC m=+21.777477259" Apr 21 14:56:18.939809 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.939760 2572 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pszc9" podStartSLOduration=2.95912718 podStartE2EDuration="20.939741234s" podCreationTimestamp="2026-04-21 14:55:58 +0000 UTC" firstStartedPulling="2026-04-21 14:55:59.025365298 +0000 UTC m=+1.879240738" lastFinishedPulling="2026-04-21 14:56:17.005979352 +0000 UTC m=+19.859854792" observedRunningTime="2026-04-21 14:56:18.939427632 +0000 UTC m=+21.793303104" watchObservedRunningTime="2026-04-21 14:56:18.939741234 +0000 UTC m=+21.793616725" Apr 21 14:56:18.953942 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:18.953871 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-52rdt" podStartSLOduration=3.913417992 podStartE2EDuration="21.953857991s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:58.965408982 +0000 UTC m=+1.819284422" lastFinishedPulling="2026-04-21 14:56:17.005848975 +0000 UTC m=+19.859724421" observedRunningTime="2026-04-21 14:56:18.953750546 +0000 UTC m=+21.807626009" watchObservedRunningTime="2026-04-21 14:56:18.953857991 +0000 UTC m=+21.807733450" Apr 21 14:56:19.062456 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.062415 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:56:19.063168 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.063145 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:56:19.188609 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.188451 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 14:56:19.647113 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.646987 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T14:56:19.188604083Z","UUID":"288dcb9a-e0b2-4ba1-9eee-c30752bc6301","Handler":null,"Name":"","Endpoint":""} Apr 21 14:56:19.648834 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.648799 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 14:56:19.648834 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.648829 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 14:56:19.717802 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.717759 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:19.718004 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:19.717894 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:19.879783 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.879338 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" event={"ID":"826656c716bc3cb9213ccda5f90267ef","Type":"ContainerStarted","Data":"84ec5e421c7dd932176917f68de8174a3289ec173bbc3f8e4aab0dfe3689314f"} Apr 21 14:56:19.882529 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.882396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" event={"ID":"fac361fb-e660-4792-b551-bbab1f86f876","Type":"ContainerStarted","Data":"7940dc84d28d40aceacbe92916f81b1883eca3debd63bab658810b7fa9d5297b"} Apr 21 14:56:19.882816 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.882793 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:56:19.883960 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.883938 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zfc5h" Apr 21 14:56:19.894856 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:19.894796 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-40.ec2.internal" podStartSLOduration=21.894772672 podStartE2EDuration="21.894772672s" podCreationTimestamp="2026-04-21 14:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:56:19.894185227 +0000 UTC m=+22.748060689" watchObservedRunningTime="2026-04-21 14:56:19.894772672 +0000 UTC m=+22.748648135" Apr 21 14:56:20.717056 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:20.717014 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:20.717246 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:20.717156 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:20.886611 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:20.886573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" event={"ID":"fac361fb-e660-4792-b551-bbab1f86f876","Type":"ContainerStarted","Data":"e78c50c0b7eb3eda89e7fc8a6cb67936ad779294b122931be001b2d44e499571"} Apr 21 14:56:20.889847 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:20.889819 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 14:56:20.890253 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:20.890215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"e78fc7dcfb585ba6c2141db7bbba25ed05deadd44dc4475a904dc2bef357f247"} Apr 21 14:56:20.912248 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:20.912198 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ks74" podStartSLOduration=2.823217858 podStartE2EDuration="23.912179464s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:58.930616969 +0000 UTC m=+1.784492411" lastFinishedPulling="2026-04-21 14:56:20.019578565 +0000 UTC m=+22.873454017" observedRunningTime="2026-04-21 14:56:20.91185083 +0000 UTC m=+23.765726292" watchObservedRunningTime="2026-04-21 14:56:20.912179464 +0000 UTC m=+23.766054925" Apr 21 14:56:21.717863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:21.717826 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:21.718065 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:21.717973 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:22.717115 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.717078 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:22.717641 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:22.717194 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:22.898185 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.898031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 14:56:22.898568 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.898532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"8f5a80ffbebee6652b1509fd357ba86f84476fb82fe4e0154dcb2868510469d4"} Apr 21 14:56:22.898974 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.898951 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:56:22.899045 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.898990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:56:22.899133 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.899119 2572 scope.go:117] "RemoveContainer" containerID="d49f6841e5d9ea29a645a76792a8b587821c988f0eb15ca519576e2a9f1758da" Apr 21 14:56:22.914059 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:22.914033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:56:23.717203 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.717168 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:23.717579 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:23.717269 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:23.903923 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.903886 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 14:56:23.904292 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.904266 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" event={"ID":"db6c90a6-c365-45f5-bad7-00c882e79192","Type":"ContainerStarted","Data":"a4b594239eec68109b1b2294671a44dfbf20330ddd7a296a26a793bb77576a0f"} Apr 21 14:56:23.904530 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.904511 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:56:23.906129 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.906106 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8dc8faa-055a-43d4-9162-2b25481ba9c8" containerID="d7973ab28f29e5f79469bffdf37d945cbadf32a4e51bd5e449cf863ea9c14de6" exitCode=0 Apr 21 14:56:23.906228 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.906154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerDied","Data":"d7973ab28f29e5f79469bffdf37d945cbadf32a4e51bd5e449cf863ea9c14de6"} Apr 21 14:56:23.919787 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.919753 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:56:23.935960 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:23.935896 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" podStartSLOduration=7.226148582 podStartE2EDuration="25.935882126s" podCreationTimestamp="2026-04-21 14:55:58 +0000 UTC" firstStartedPulling="2026-04-21 14:55:59.029450718 +0000 UTC m=+1.883326172" lastFinishedPulling="2026-04-21 14:56:17.739184275 +0000 UTC m=+20.593059716" observedRunningTime="2026-04-21 14:56:23.934052973 +0000 UTC m=+26.787928435" watchObservedRunningTime="2026-04-21 14:56:23.935882126 +0000 UTC m=+26.789757588" Apr 21 14:56:24.717896 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.717712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:24.718352 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:24.718028 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:24.751757 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.751724 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-576cx"] Apr 21 14:56:24.751941 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.751847 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:24.752007 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:24.751974 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:24.754044 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.754018 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gzsbc"] Apr 21 14:56:24.912768 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.912730 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8dc8faa-055a-43d4-9162-2b25481ba9c8" containerID="a2451a361b48a7156d1bcfe6aa330b2db6eb3a404812aa14f2846bc0bb66c897" exitCode=0 Apr 21 14:56:24.912978 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.912814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerDied","Data":"a2451a361b48a7156d1bcfe6aa330b2db6eb3a404812aa14f2846bc0bb66c897"} Apr 21 14:56:24.912978 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:24.912901 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:24.913108 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:24.913047 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:25.916937 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:25.916832 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8dc8faa-055a-43d4-9162-2b25481ba9c8" containerID="5f070427963c74ac811eac8c2722f98150e4b494ac38dcce9c38cec55911faea" exitCode=0 Apr 21 14:56:25.917392 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:25.916934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerDied","Data":"5f070427963c74ac811eac8c2722f98150e4b494ac38dcce9c38cec55911faea"} Apr 21 14:56:26.717218 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:26.717182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:26.717218 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:26.717208 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:26.717514 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:26.717305 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:26.717514 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:26.717439 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:28.717642 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:28.717606 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:28.718285 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:28.717748 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:56:28.718285 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:28.717832 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:28.718285 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:28.717965 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-576cx" podUID="69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5" Apr 21 14:56:30.434965 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.434931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.434988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.435117 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.435131 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.435147 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.435161 2572 projected.go:194] Error preparing data for projected volume kube-api-access-kjtfz for pod openshift-network-diagnostics/network-check-target-576cx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.435181 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:02.435163055 +0000 UTC m=+65.289038508 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 14:56:30.435506 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.435208 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz podName:69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:02.435193739 +0000 UTC m=+65.289069179 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kjtfz" (UniqueName: "kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz") pod "network-check-target-576cx" (UID: "69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 14:56:30.508670 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.508635 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-40.ec2.internal" event="NodeReady" Apr 21 14:56:30.508838 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.508795 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 14:56:30.555415 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.555387 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v55rn"] Apr 21 14:56:30.582795 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.582758 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vr92l"] Apr 21 14:56:30.583458 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.583434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.586440 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.586415 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 14:56:30.586688 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.586665 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5sqkw\"" Apr 21 14:56:30.586791 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.586665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 14:56:30.593010 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.592891 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v55rn"] Apr 21 14:56:30.593010 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.592935 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vr92l"] Apr 21 14:56:30.593148 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.593025 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:30.595851 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.595831 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 14:56:30.595976 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.595837 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7xfsn\"" Apr 21 14:56:30.595976 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.595885 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 14:56:30.596085 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.595983 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 14:56:30.717561 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.717531 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:56:30.717739 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.717530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:56:30.720803 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.720781 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cb4hj\"" Apr 21 14:56:30.720938 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.720901 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:56:30.720938 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.720931 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:56:30.721135 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.720943 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sdz22\"" Apr 21 14:56:30.721256 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.721238 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:56:30.737301 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.737281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.737402 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.737333 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:30.737454 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.737398 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3617c41-91e1-4dea-bc4c-4a975db40cbd-config-volume\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.737454 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.737436 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv5lm\" (UniqueName: \"kubernetes.io/projected/35ffa47c-97fe-49fe-a050-659e851233d4-kube-api-access-hv5lm\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:30.737541 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.737472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3617c41-91e1-4dea-bc4c-4a975db40cbd-tmp-dir\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.737541 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.737513 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvdx\" (UniqueName: \"kubernetes.io/projected/f3617c41-91e1-4dea-bc4c-4a975db40cbd-kube-api-access-xbvdx\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.838231 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvdx\" (UniqueName: \"kubernetes.io/projected/f3617c41-91e1-4dea-bc4c-4a975db40cbd-kube-api-access-xbvdx\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.838231 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.838482 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:30.838482 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.838383 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:30.838482 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3617c41-91e1-4dea-bc4c-4a975db40cbd-config-volume\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.838482 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.838470 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:31.338446735 +0000 UTC m=+34.192322180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:56:30.838696 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.838486 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:30.838696 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:30.838578 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:56:31.338559345 +0000 UTC m=+34.192434794 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:56:30.838696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv5lm\" (UniqueName: \"kubernetes.io/projected/35ffa47c-97fe-49fe-a050-659e851233d4-kube-api-access-hv5lm\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:30.838696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3617c41-91e1-4dea-bc4c-4a975db40cbd-tmp-dir\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.838954 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.838942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3617c41-91e1-4dea-bc4c-4a975db40cbd-tmp-dir\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.839122 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.839102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3617c41-91e1-4dea-bc4c-4a975db40cbd-config-volume\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:30.849895 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.849824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv5lm\" (UniqueName: \"kubernetes.io/projected/35ffa47c-97fe-49fe-a050-659e851233d4-kube-api-access-hv5lm\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:30.856272 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:30.856250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvdx\" (UniqueName: \"kubernetes.io/projected/f3617c41-91e1-4dea-bc4c-4a975db40cbd-kube-api-access-xbvdx\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:31.342366 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:31.342321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:31.342366 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:31.342376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:31.342606 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:31.342478 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 21 14:56:31.342606 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:31.342488 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:31.342606 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:31.342533 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:32.342518414 +0000 UTC m=+35.196393854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:56:31.342606 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:31.342556 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:56:32.342540353 +0000 UTC m=+35.196415793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:56:31.930662 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:31.930438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerStarted","Data":"a2d845a1901b30bc2643f77ff46689752724ae56d1bc9f559e76aa7015099c31"} Apr 21 14:56:32.350896 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:32.350858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:32.351079 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:32.350951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:32.351079 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:32.351026 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:32.351151 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:32.351088 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:34.351073282 +0000 UTC m=+37.204948722 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:56:32.351151 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:32.351026 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:32.351151 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:32.351143 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:56:34.351130623 +0000 UTC m=+37.205006075 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:56:32.934480 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:32.934448 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8dc8faa-055a-43d4-9162-2b25481ba9c8" containerID="a2d845a1901b30bc2643f77ff46689752724ae56d1bc9f559e76aa7015099c31" exitCode=0 Apr 21 14:56:32.934948 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:32.934500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerDied","Data":"a2d845a1901b30bc2643f77ff46689752724ae56d1bc9f559e76aa7015099c31"} Apr 21 14:56:33.939138 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:33.939105 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8dc8faa-055a-43d4-9162-2b25481ba9c8" containerID="e3374bb4e272a9ec578dd3ca602193be991ea2fbf1d706cf5abeb5f5abf2ee42" exitCode=0 Apr 21 14:56:33.939522 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:33.939166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerDied","Data":"e3374bb4e272a9ec578dd3ca602193be991ea2fbf1d706cf5abeb5f5abf2ee42"} Apr 21 14:56:34.366115 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:34.366075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:34.366267 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:34.366137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:34.366267 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:34.366249 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:34.366341 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:34.366309 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd 
nodeName:}" failed. No retries permitted until 2026-04-21 14:56:38.366293502 +0000 UTC m=+41.220168943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:56:34.366341 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:34.366257 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:34.366409 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:34.366368 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:38.366358036 +0000 UTC m=+41.220233491 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:56:34.944205 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:34.944175 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27qzj" event={"ID":"a8dc8faa-055a-43d4-9162-2b25481ba9c8","Type":"ContainerStarted","Data":"1daeda9c7d4829bc849ea14e0ea4bdf4b018697180677fc31170fadf0de267c0"} Apr 21 14:56:34.974811 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:34.974763 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-27qzj" podStartSLOduration=5.298394372 podStartE2EDuration="37.974748291s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 14:55:58.989057912 +0000 UTC m=+1.842933353" lastFinishedPulling="2026-04-21 14:56:31.665411828 +0000 UTC m=+34.519287272" observedRunningTime="2026-04-21 14:56:34.973228235 +0000 UTC m=+37.827103697" watchObservedRunningTime="2026-04-21 14:56:34.974748291 +0000 UTC m=+37.828623753" Apr 21 14:56:38.396428 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:38.396386 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:38.396815 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:38.396450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:38.396815 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:38.396533 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:38.396815 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:38.396539 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:38.396815 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:38.396585 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:56:46.396571328 +0000 UTC m=+49.250446773 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:56:38.396815 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:38.396599 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:56:46.396593009 +0000 UTC m=+49.250468449 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:56:46.450797 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:46.450754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:56:46.451199 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:46.450808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:56:46.451199 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:46.450898 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:56:46.451199 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:46.450983 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:57:02.450966916 +0000 UTC m=+65.304842356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:56:46.451199 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:46.450921 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:56:46.451199 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:56:46.451043 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:02.451030951 +0000 UTC m=+65.304906391 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:56:55.840553 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.840507 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd"] Apr 21 14:56:55.845334 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.845309 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:55.848357 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.848310 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lfcql\"" Apr 21 14:56:55.849741 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.849708 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 14:56:55.850198 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.850172 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 14:56:55.850331 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.850230 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 14:56:55.850331 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.850238 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 14:56:55.853614 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.853583 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5"] Apr 21 14:56:55.856628 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.856602 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:55.857078 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.857056 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd"] Apr 21 14:56:55.859562 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.859542 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 14:56:55.859661 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.859544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 14:56:55.859661 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.859583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 14:56:55.859857 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.859841 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 14:56:55.865690 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.865669 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5"] Apr 21 14:56:55.927730 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:55.927711 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rgqmx" Apr 21 14:56:56.015199 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fea514dd-abf1-47c6-9cfb-c80d5ddccc14-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd\" (UID: \"fea514dd-abf1-47c6-9cfb-c80d5ddccc14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.015302 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5vx\" (UniqueName: \"kubernetes.io/projected/fea514dd-abf1-47c6-9cfb-c80d5ddccc14-kube-api-access-zn5vx\") pod \"managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd\" (UID: \"fea514dd-abf1-47c6-9cfb-c80d5ddccc14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.015302 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2de2a64b-fab9-4f6a-9290-789166c4da39-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.015302 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t44vm\" (UniqueName: \"kubernetes.io/projected/2de2a64b-fab9-4f6a-9290-789166c4da39-kube-api-access-t44vm\") pod 
\"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.015302 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-ca\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.015514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.015514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.015514 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.015510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-hub\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.115890 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.115826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-hub\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.115890 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.115884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fea514dd-abf1-47c6-9cfb-c80d5ddccc14-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd\" (UID: \"fea514dd-abf1-47c6-9cfb-c80d5ddccc14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.116026 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.115902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5vx\" (UniqueName: \"kubernetes.io/projected/fea514dd-abf1-47c6-9cfb-c80d5ddccc14-kube-api-access-zn5vx\") pod \"managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd\" (UID: \"fea514dd-abf1-47c6-9cfb-c80d5ddccc14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.116026 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.115943 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2de2a64b-fab9-4f6a-9290-789166c4da39-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.116026 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.115973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t44vm\" (UniqueName: \"kubernetes.io/projected/2de2a64b-fab9-4f6a-9290-789166c4da39-kube-api-access-t44vm\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.116183 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.116046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-ca\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.116333 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.116310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.116406 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.116360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.117842 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.117809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2de2a64b-fab9-4f6a-9290-789166c4da39-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.120004 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.119975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.120130 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.120025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.120130 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.120100 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-hub\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.120254 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.120140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2de2a64b-fab9-4f6a-9290-789166c4da39-ca\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.120320 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.120295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fea514dd-abf1-47c6-9cfb-c80d5ddccc14-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd\" (UID: \"fea514dd-abf1-47c6-9cfb-c80d5ddccc14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.125209 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.125179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t44vm\" (UniqueName: \"kubernetes.io/projected/2de2a64b-fab9-4f6a-9290-789166c4da39-kube-api-access-t44vm\") pod \"cluster-proxy-proxy-agent-64c6fcc965-rpcx5\" (UID: \"2de2a64b-fab9-4f6a-9290-789166c4da39\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.125336 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.125311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5vx\" (UniqueName: \"kubernetes.io/projected/fea514dd-abf1-47c6-9cfb-c80d5ddccc14-kube-api-access-zn5vx\") pod \"managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd\" (UID: \"fea514dd-abf1-47c6-9cfb-c80d5ddccc14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.165080 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.165056 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" Apr 21 14:56:56.173716 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.173689 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:56:56.300445 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.300420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd"] Apr 21 14:56:56.303847 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:56:56.303822 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfea514dd_abf1_47c6_9cfb_c80d5ddccc14.slice/crio-91186d1647f18652c02a7e2fde6fe344b2029fe8ab1f97aff928f28efc77c468 WatchSource:0}: Error finding container 91186d1647f18652c02a7e2fde6fe344b2029fe8ab1f97aff928f28efc77c468: Status 404 returned error can't find the container with id 91186d1647f18652c02a7e2fde6fe344b2029fe8ab1f97aff928f28efc77c468 Apr 21 14:56:56.315331 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.315306 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5"] Apr 21 14:56:56.317628 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:56:56.317596 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de2a64b_fab9_4f6a_9290_789166c4da39.slice/crio-917f716a9a09b17d8c56d7c196441b5673c9f75275e33425184af6821f8c0edd WatchSource:0}: Error finding container 917f716a9a09b17d8c56d7c196441b5673c9f75275e33425184af6821f8c0edd: Status 404 returned error can't find the container with id 917f716a9a09b17d8c56d7c196441b5673c9f75275e33425184af6821f8c0edd Apr 21 14:56:56.985246 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.985209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" event={"ID":"2de2a64b-fab9-4f6a-9290-789166c4da39","Type":"ContainerStarted","Data":"917f716a9a09b17d8c56d7c196441b5673c9f75275e33425184af6821f8c0edd"} Apr 21 14:56:56.986261 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:56.986231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" event={"ID":"fea514dd-abf1-47c6-9cfb-c80d5ddccc14","Type":"ContainerStarted","Data":"91186d1647f18652c02a7e2fde6fe344b2029fe8ab1f97aff928f28efc77c468"} Apr 21 14:56:59.994566 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:59.994534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" event={"ID":"2de2a64b-fab9-4f6a-9290-789166c4da39","Type":"ContainerStarted","Data":"62a31ce794d84eea67e50700cc988c3234eb8bac211499442d841cce1abe175a"} Apr 21 14:56:59.995775 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:56:59.995748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" event={"ID":"fea514dd-abf1-47c6-9cfb-c80d5ddccc14","Type":"ContainerStarted","Data":"540811a69db1796cb63281d7b3b6a810f5813dbc0f7c92e5d3938d835bf8ab46"} Apr 21 14:57:00.009961 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:00.009899 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8678f9b6bc-fbnfd" podStartSLOduration=1.475222142 podStartE2EDuration="5.009886555s" podCreationTimestamp="2026-04-21 14:56:55 +0000 
UTC" firstStartedPulling="2026-04-21 14:56:56.30563332 +0000 UTC m=+59.159508760" lastFinishedPulling="2026-04-21 14:56:59.840297721 +0000 UTC m=+62.694173173" observedRunningTime="2026-04-21 14:57:00.00985447 +0000 UTC m=+62.863729931" watchObservedRunningTime="2026-04-21 14:57:00.009886555 +0000 UTC m=+62.863762017" Apr 21 14:57:02.465489 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.465456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:57:02.465489 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.465498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:57:02.466039 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.465529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:57:02.466039 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:02.465632 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:57:02.466039 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:02.465694 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:57:34.465678481 +0000 UTC m=+97.319553920 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:57:02.466039 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.465735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:57:02.466039 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:02.465801 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:57:02.466039 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:02.465837 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:57:34.465826928 +0000 UTC m=+97.319702368 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:57:02.468579 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.468561 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 14:57:02.468731 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.468712 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 14:57:02.475963 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:02.475945 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:57:02.476028 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:02.475986 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:06.475974588 +0000 UTC m=+129.329850028 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : secret "metrics-daemon-secret" not found Apr 21 14:57:02.478199 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.478182 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 14:57:02.488740 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.488721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtfz\" (UniqueName: \"kubernetes.io/projected/69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5-kube-api-access-kjtfz\") pod \"network-check-target-576cx\" (UID: \"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5\") " pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:57:02.536184 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.536157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sdz22\"" Apr 21 14:57:02.543635 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.543617 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:57:02.750655 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:02.750628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-576cx"] Apr 21 14:57:02.753710 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:57:02.753681 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b7cdb0_e1e4_42f7_ae35_7c38c6cc12b5.slice/crio-ea4ab3d575f4abab7c61ebb145f59b695f2a106770c5bf419d719be43051d414 WatchSource:0}: Error finding container ea4ab3d575f4abab7c61ebb145f59b695f2a106770c5bf419d719be43051d414: Status 404 returned error can't find the container with id ea4ab3d575f4abab7c61ebb145f59b695f2a106770c5bf419d719be43051d414 Apr 21 14:57:03.002965 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:03.002859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-576cx" event={"ID":"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5","Type":"ContainerStarted","Data":"ea4ab3d575f4abab7c61ebb145f59b695f2a106770c5bf419d719be43051d414"} Apr 21 14:57:03.004780 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:03.004755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" event={"ID":"2de2a64b-fab9-4f6a-9290-789166c4da39","Type":"ContainerStarted","Data":"23b5b30fca6ac4d134d66e9c9cd121c70dd6b98de1bc4410d273de7aef609f2d"} Apr 21 14:57:03.004780 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:03.004783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" event={"ID":"2de2a64b-fab9-4f6a-9290-789166c4da39","Type":"ContainerStarted","Data":"986c7f6bf16b2a32b09351ea17b8613f882f50d8e6f033d55a22f1fa84d3f19d"} Apr 21 14:57:03.023365 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:03.023324 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" podStartSLOduration=1.699457523 podStartE2EDuration="8.023313069s" podCreationTimestamp="2026-04-21 14:56:55 +0000 UTC" firstStartedPulling="2026-04-21 14:56:56.319266624 +0000 UTC m=+59.173142065" lastFinishedPulling="2026-04-21 14:57:02.643122156 +0000 UTC m=+65.496997611" observedRunningTime="2026-04-21 14:57:03.023026932 +0000 UTC m=+65.876902406" watchObservedRunningTime="2026-04-21 14:57:03.023313069 +0000 UTC m=+65.877188522" Apr 21 14:57:06.013829 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:06.013789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-576cx" event={"ID":"69b7cdb0-e1e4-42f7-ae35-7c38c6cc12b5","Type":"ContainerStarted","Data":"d8784820854751c19648688322b7ba057d354d82a65ee511ae21c8fc1990c40f"} Apr 21 14:57:06.014227 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:06.013925 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:57:06.029818 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:06.029770 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-576cx" podStartSLOduration=66.300270543 podStartE2EDuration="1m9.029757667s" podCreationTimestamp="2026-04-21 14:55:57 +0000 UTC" 
firstStartedPulling="2026-04-21 14:57:02.75580314 +0000 UTC m=+65.609678583" lastFinishedPulling="2026-04-21 14:57:05.485290264 +0000 UTC m=+68.339165707" observedRunningTime="2026-04-21 14:57:06.029216109 +0000 UTC m=+68.883091570" watchObservedRunningTime="2026-04-21 14:57:06.029757667 +0000 UTC m=+68.883633130" Apr 21 14:57:34.493056 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:34.493013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:57:34.493467 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:34.493074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:57:34.493467 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:34.493163 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 14:57:34.493467 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:34.493221 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert podName:35ffa47c-97fe-49fe-a050-659e851233d4 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:38.493207634 +0000 UTC m=+161.347083074 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert") pod "ingress-canary-vr92l" (UID: "35ffa47c-97fe-49fe-a050-659e851233d4") : secret "canary-serving-cert" not found Apr 21 14:57:34.493467 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:34.493163 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 14:57:34.493467 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:57:34.493312 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls podName:f3617c41-91e1-4dea-bc4c-4a975db40cbd nodeName:}" failed. No retries permitted until 2026-04-21 14:58:38.493297133 +0000 UTC m=+161.347172579 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls") pod "dns-default-v55rn" (UID: "f3617c41-91e1-4dea-bc4c-4a975db40cbd") : secret "dns-default-metrics-tls" not found Apr 21 14:57:37.019605 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:37.019575 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-576cx" Apr 21 14:57:48.961539 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:48.961511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-52rdt_31f48be8-c9fa-4f17-9944-60b8aaace332/dns-node-resolver/0.log" Apr 21 14:57:50.160957 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:57:50.160928 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-csr2d_bfd7a190-efd2-4b62-9acb-5f68c16053f5/node-ca/0.log" Apr 21 14:58:00.672430 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.672084 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2jbhm"] Apr 21 14:58:00.674285 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.674267 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.677164 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.677140 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 14:58:00.678472 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.678440 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 14:58:00.678472 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.678449 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s4jk5\"" Apr 21 14:58:00.678472 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.678448 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 14:58:00.678660 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.678462 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 14:58:00.686772 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.686751 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2jbhm"] Apr 21 14:58:00.775921 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.775878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/50db6618-1e4c-443b-bb09-94f4961f7983-crio-socket\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.776011 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.775938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/50db6618-1e4c-443b-bb09-94f4961f7983-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.776011 
ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.775985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczw8\" (UniqueName: \"kubernetes.io/projected/50db6618-1e4c-443b-bb09-94f4961f7983-kube-api-access-qczw8\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.776085 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.776021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/50db6618-1e4c-443b-bb09-94f4961f7983-data-volume\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.776085 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.776047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877164 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/50db6618-1e4c-443b-bb09-94f4961f7983-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877245 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qczw8\" (UniqueName: \"kubernetes.io/projected/50db6618-1e4c-443b-bb09-94f4961f7983-kube-api-access-qczw8\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877245 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/50db6618-1e4c-443b-bb09-94f4961f7983-data-volume\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877245 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877361 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:00.877346 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:00.877404 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/50db6618-1e4c-443b-bb09-94f4961f7983-crio-socket\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877443 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:00.877415 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls podName:50db6618-1e4c-443b-bb09-94f4961f7983 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:01.3773974 +0000 UTC m=+124.231272841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2jbhm" (UID: "50db6618-1e4c-443b-bb09-94f4961f7983") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:00.877497 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877448 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/50db6618-1e4c-443b-bb09-94f4961f7983-crio-socket\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877619 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/50db6618-1e4c-443b-bb09-94f4961f7983-data-volume\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.877801 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.877784 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/50db6618-1e4c-443b-bb09-94f4961f7983-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:00.893295 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:00.893275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczw8\" (UniqueName: \"kubernetes.io/projected/50db6618-1e4c-443b-bb09-94f4961f7983-kube-api-access-qczw8\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:01.381565 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:01.381515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:01.381764 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:01.381643 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:01.381764 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:01.381715 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls 
podName:50db6618-1e4c-443b-bb09-94f4961f7983 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:02.381697499 +0000 UTC m=+125.235572940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2jbhm" (UID: "50db6618-1e4c-443b-bb09-94f4961f7983") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:02.388328 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:02.388295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:02.388670 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:02.388411 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:02.388670 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:02.388471 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls podName:50db6618-1e4c-443b-bb09-94f4961f7983 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:04.388456838 +0000 UTC m=+127.242332283 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2jbhm" (UID: "50db6618-1e4c-443b-bb09-94f4961f7983") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:04.401858 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:04.401822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:04.402247 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:04.401989 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 14:58:04.402247 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:04.402065 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls podName:50db6618-1e4c-443b-bb09-94f4961f7983 nodeName:}" failed. No retries permitted until 2026-04-21 14:58:08.402046751 +0000 UTC m=+131.255922194 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls") pod "insights-runtime-extractor-2jbhm" (UID: "50db6618-1e4c-443b-bb09-94f4961f7983") : secret "insights-runtime-extractor-tls" not found Apr 21 14:58:06.515649 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:06.515607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:58:06.516121 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:06.515768 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 14:58:06.516121 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:06.515836 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs podName:ac89606d-af67-40a9-8819-d321ad5b6b55 nodeName:}" failed. No retries permitted until 2026-04-21 15:00:08.515819675 +0000 UTC m=+251.369695114 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs") pod "network-metrics-daemon-gzsbc" (UID: "ac89606d-af67-40a9-8819-d321ad5b6b55") : secret "metrics-daemon-secret" not found Apr 21 14:58:08.428648 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:08.428606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:08.430863 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:08.430840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50db6618-1e4c-443b-bb09-94f4961f7983-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2jbhm\" (UID: \"50db6618-1e4c-443b-bb09-94f4961f7983\") " pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:08.483273 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:08.483233 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2jbhm" Apr 21 14:58:08.601738 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:08.601667 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2jbhm"] Apr 21 14:58:08.605341 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:58:08.605314 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50db6618_1e4c_443b_bb09_94f4961f7983.slice/crio-7ee0871891fdcff0552d58b96807d7d271afdacfdb7af0c5000cc04ff05eeea4 WatchSource:0}: Error finding container 7ee0871891fdcff0552d58b96807d7d271afdacfdb7af0c5000cc04ff05eeea4: Status 404 returned error can't find the container with id 7ee0871891fdcff0552d58b96807d7d271afdacfdb7af0c5000cc04ff05eeea4 Apr 21 14:58:09.168925 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:09.168877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jbhm" event={"ID":"50db6618-1e4c-443b-bb09-94f4961f7983","Type":"ContainerStarted","Data":"d67f5a8866e136a4dfed5e0912b747ac25ac5daf94635d21ce49d51775d17cbf"} Apr 21 14:58:09.168925 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:09.168926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jbhm" event={"ID":"50db6618-1e4c-443b-bb09-94f4961f7983","Type":"ContainerStarted","Data":"7ee0871891fdcff0552d58b96807d7d271afdacfdb7af0c5000cc04ff05eeea4"} Apr 21 14:58:10.173376 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:10.173335 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jbhm" event={"ID":"50db6618-1e4c-443b-bb09-94f4961f7983","Type":"ContainerStarted","Data":"b5fcb179db10b698b9fe2a34d58af20f24160127bcec2834fc12623e20f35475"} Apr 21 14:58:11.177178 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:11.177142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2jbhm" event={"ID":"50db6618-1e4c-443b-bb09-94f4961f7983","Type":"ContainerStarted","Data":"322b667b5459e7f656bf1ba323e859bdcdd82b80b2692d5fa583cf120209b018"} Apr 21 14:58:11.196742 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:11.196691 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2jbhm" podStartSLOduration=9.314479181 podStartE2EDuration="11.196677012s" podCreationTimestamp="2026-04-21 14:58:00 +0000 UTC" firstStartedPulling="2026-04-21 14:58:08.661399939 +0000 UTC m=+131.515275378" lastFinishedPulling="2026-04-21 14:58:10.543597766 +0000 UTC m=+133.397473209" observedRunningTime="2026-04-21 14:58:11.194642961 +0000 UTC m=+134.048518423" watchObservedRunningTime="2026-04-21 14:58:11.196677012 +0000 UTC m=+134.050552504" Apr 21 14:58:21.665386 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.665348 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-677bfb6859-72cm4"] Apr 21 14:58:21.669593 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.669572 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.672230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.672207 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-47f52\"" Apr 21 14:58:21.672352 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.672257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 14:58:21.673608 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.673589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 14:58:21.673722 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.673593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 14:58:21.677445 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.677427 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 14:58:21.684048 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.684027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-677bfb6859-72cm4"] Apr 21 14:58:21.831793 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.831754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f4f8693-25c3-43cb-be49-ff52f766df6f-image-registry-private-configuration\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.831793 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.831808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f4f8693-25c3-43cb-be49-ff52f766df6f-ca-trust-extracted\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.832095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.831828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f4f8693-25c3-43cb-be49-ff52f766df6f-registry-certificates\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.832095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.831851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f4f8693-25c3-43cb-be49-ff52f766df6f-trusted-ca\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.832095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.831944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldk8d\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-kube-api-access-ldk8d\") pod 
\"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.832095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.831992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f4f8693-25c3-43cb-be49-ff52f766df6f-installation-pull-secrets\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.832095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.832031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-registry-tls\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.832095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.832052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-bound-sa-token\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932510 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f4f8693-25c3-43cb-be49-ff52f766df6f-image-registry-private-configuration\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932510 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f4f8693-25c3-43cb-be49-ff52f766df6f-ca-trust-extracted\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f4f8693-25c3-43cb-be49-ff52f766df6f-registry-certificates\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f4f8693-25c3-43cb-be49-ff52f766df6f-trusted-ca\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldk8d\" (UniqueName: 
\"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-kube-api-access-ldk8d\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932696 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f4f8693-25c3-43cb-be49-ff52f766df6f-installation-pull-secrets\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932877 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-registry-tls\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.932877 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.932872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-bound-sa-token\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.933249 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.933009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f4f8693-25c3-43cb-be49-ff52f766df6f-ca-trust-extracted\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.933544 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.933518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f4f8693-25c3-43cb-be49-ff52f766df6f-registry-certificates\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.933632 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.933613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f4f8693-25c3-43cb-be49-ff52f766df6f-trusted-ca\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.935058 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.935028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f4f8693-25c3-43cb-be49-ff52f766df6f-image-registry-private-configuration\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.935182 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.935068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/7f4f8693-25c3-43cb-be49-ff52f766df6f-installation-pull-secrets\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.935298 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.935278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-registry-tls\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.944628 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.944607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-bound-sa-token\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.944728 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.944671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldk8d\" (UniqueName: \"kubernetes.io/projected/7f4f8693-25c3-43cb-be49-ff52f766df6f-kube-api-access-ldk8d\") pod \"image-registry-677bfb6859-72cm4\" (UID: \"7f4f8693-25c3-43cb-be49-ff52f766df6f\") " pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:21.978639 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:21.978602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:22.099689 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:22.099652 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-677bfb6859-72cm4"] Apr 21 14:58:22.102478 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:58:22.102446 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f4f8693_25c3_43cb_be49_ff52f766df6f.slice/crio-135f1ee794707856acf1b0c9206826295195293063d66734dd54b02df2db3dab WatchSource:0}: Error finding container 135f1ee794707856acf1b0c9206826295195293063d66734dd54b02df2db3dab: Status 404 returned error can't find the container with id 135f1ee794707856acf1b0c9206826295195293063d66734dd54b02df2db3dab Apr 21 14:58:22.205721 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:22.205687 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" event={"ID":"7f4f8693-25c3-43cb-be49-ff52f766df6f","Type":"ContainerStarted","Data":"11455e90c06e12078d28c480af105d46ffa0cea51f6c7f4f3e06ec4b66085b2d"} Apr 21 14:58:22.205721 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:22.205722 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" event={"ID":"7f4f8693-25c3-43cb-be49-ff52f766df6f","Type":"ContainerStarted","Data":"135f1ee794707856acf1b0c9206826295195293063d66734dd54b02df2db3dab"} Apr 21 14:58:22.205940 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:22.205833 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:22.228288 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:22.228230 2572 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" podStartSLOduration=1.228211209 podStartE2EDuration="1.228211209s" podCreationTimestamp="2026-04-21 14:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 14:58:22.227346567 +0000 UTC m=+145.081222033" watchObservedRunningTime="2026-04-21 14:58:22.228211209 +0000 UTC m=+145.082086671" Apr 21 14:58:30.193721 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.193683 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wvgjh"] Apr 21 14:58:30.198575 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.198558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.201274 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.201224 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 14:58:30.201478 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.201458 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 14:58:30.201587 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.201536 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-v8bhv\"" Apr 21 14:58:30.202734 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.202713 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 14:58:30.202820 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.202721 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 14:58:30.202820 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.202751 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 14:58:30.202820 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.202802 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 14:58:30.290268 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-accelerators-collector-config\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290268 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-tls\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290524 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-textfile\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290524 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-root\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290524 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a393e9-c391-463f-ae8b-618b766b8ca3-metrics-client-ca\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290524 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-wtmp\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290524 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp2vh\" (UniqueName: \"kubernetes.io/projected/c4a393e9-c391-463f-ae8b-618b766b8ca3-kube-api-access-pp2vh\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290677 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-sys\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.290677 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.290566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.391796 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.391755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-sys\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.391796 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.391795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " 
pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392033 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.391819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-accelerators-collector-config\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392033 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.391837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-tls\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392033 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.391856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-textfile\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392033 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.391881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-sys\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392033 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-root\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392204 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a393e9-c391-463f-ae8b-618b766b8ca3-metrics-client-ca\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392204 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-wtmp\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392204 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-root\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392204 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp2vh\" (UniqueName: \"kubernetes.io/projected/c4a393e9-c391-463f-ae8b-618b766b8ca3-kube-api-access-pp2vh\") pod 
\"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392204 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-textfile\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392405 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-wtmp\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392596 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-accelerators-collector-config\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.392707 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.392610 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a393e9-c391-463f-ae8b-618b766b8ca3-metrics-client-ca\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.394262 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.394237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-tls\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.394341 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.394327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4a393e9-c391-463f-ae8b-618b766b8ca3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.400388 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.400367 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp2vh\" (UniqueName: \"kubernetes.io/projected/c4a393e9-c391-463f-ae8b-618b766b8ca3-kube-api-access-pp2vh\") pod \"node-exporter-wvgjh\" (UID: \"c4a393e9-c391-463f-ae8b-618b766b8ca3\") " pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.507761 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:30.507732 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wvgjh" Apr 21 14:58:30.515427 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:58:30.515390 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4a393e9_c391_463f_ae8b_618b766b8ca3.slice/crio-fb21740b0fdedb3614d7048513b9ada9a7e7d6821f848af757dcb74864a2ca25 WatchSource:0}: Error finding container fb21740b0fdedb3614d7048513b9ada9a7e7d6821f848af757dcb74864a2ca25: Status 404 returned error can't find the container with id fb21740b0fdedb3614d7048513b9ada9a7e7d6821f848af757dcb74864a2ca25 Apr 21 14:58:31.227624 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:31.227597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wvgjh" event={"ID":"c4a393e9-c391-463f-ae8b-618b766b8ca3","Type":"ContainerStarted","Data":"fb21740b0fdedb3614d7048513b9ada9a7e7d6821f848af757dcb74864a2ca25"} Apr 21 14:58:32.234520 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:32.234480 2572 generic.go:358] "Generic (PLEG): container finished" podID="c4a393e9-c391-463f-ae8b-618b766b8ca3" containerID="924ee1c9f6de13f0ef854c9fe7e54f6b29b3e2f9927fed4c8bf534a8518c5800" exitCode=0 Apr 21 14:58:32.234520 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:32.234522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wvgjh" event={"ID":"c4a393e9-c391-463f-ae8b-618b766b8ca3","Type":"ContainerDied","Data":"924ee1c9f6de13f0ef854c9fe7e54f6b29b3e2f9927fed4c8bf534a8518c5800"} Apr 21 14:58:33.239095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:33.239051 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wvgjh" event={"ID":"c4a393e9-c391-463f-ae8b-618b766b8ca3","Type":"ContainerStarted","Data":"21e7cd7883ffaf2dfc141c49810bddadef3d444dca522dea8e5709804c4a5609"} Apr 21 14:58:33.239095 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:33.239098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wvgjh" event={"ID":"c4a393e9-c391-463f-ae8b-618b766b8ca3","Type":"ContainerStarted","Data":"d5670a7ec0990bd856b419ea18714d57f528e524fafe7273959dc9ea61985500"} Apr 21 14:58:33.263201 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:33.263153 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wvgjh" podStartSLOduration=2.582329453 podStartE2EDuration="3.263138117s" podCreationTimestamp="2026-04-21 14:58:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:30.517170956 +0000 UTC m=+153.371046413" lastFinishedPulling="2026-04-21 14:58:31.197979633 +0000 UTC m=+154.051855077" observedRunningTime="2026-04-21 14:58:33.261828384 +0000 UTC m=+156.115703846" watchObservedRunningTime="2026-04-21 14:58:33.263138117 +0000 UTC m=+156.117013579" Apr 21 14:58:33.595138 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:33.595055 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-v55rn" podUID="f3617c41-91e1-4dea-bc4c-4a975db40cbd" Apr 21 14:58:33.602568 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:33.602540 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vr92l" 
podUID="35ffa47c-97fe-49fe-a050-659e851233d4" Apr 21 14:58:33.728603 ip-10-0-134-40 kubenswrapper[2572]: E0421 14:58:33.728576 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gzsbc" podUID="ac89606d-af67-40a9-8819-d321ad5b6b55" Apr 21 14:58:34.242922 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:34.242889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v55rn" Apr 21 14:58:34.243329 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:34.242892 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:58:36.175360 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:36.175291 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" podUID="2de2a64b-fab9-4f6a-9290-789166c4da39" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 14:58:38.551885 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.551852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:58:38.552230 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.551896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:58:38.554400 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.554380 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3617c41-91e1-4dea-bc4c-4a975db40cbd-metrics-tls\") pod \"dns-default-v55rn\" (UID: \"f3617c41-91e1-4dea-bc4c-4a975db40cbd\") " pod="openshift-dns/dns-default-v55rn" Apr 21 14:58:38.554528 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.554508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35ffa47c-97fe-49fe-a050-659e851233d4-cert\") pod \"ingress-canary-vr92l\" (UID: \"35ffa47c-97fe-49fe-a050-659e851233d4\") " pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:58:38.747223 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.747194 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7xfsn\"" Apr 21 14:58:38.747223 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.747196 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5sqkw\"" Apr 21 14:58:38.753626 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.753606 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vr92l" Apr 21 14:58:38.753702 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.753694 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v55rn" Apr 21 14:58:38.877896 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.877861 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vr92l"] Apr 21 14:58:38.881648 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:58:38.881623 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35ffa47c_97fe_49fe_a050_659e851233d4.slice/crio-941d67649e62f90c146db3a9084a8c7fe068ee7643f5ab94ce1491fe38598978 WatchSource:0}: Error finding container 941d67649e62f90c146db3a9084a8c7fe068ee7643f5ab94ce1491fe38598978: Status 404 returned error can't find the container with id 941d67649e62f90c146db3a9084a8c7fe068ee7643f5ab94ce1491fe38598978 Apr 21 14:58:38.896144 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:38.896117 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v55rn"] Apr 21 14:58:38.898933 ip-10-0-134-40 kubenswrapper[2572]: W0421 14:58:38.898896 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3617c41_91e1_4dea_bc4c_4a975db40cbd.slice/crio-624bd8b9f711d6fea52e24cd8d30fe8661203682406653a55cdfbee1bffca923 WatchSource:0}: Error finding container 624bd8b9f711d6fea52e24cd8d30fe8661203682406653a55cdfbee1bffca923: Status 404 returned error can't find the container with id 624bd8b9f711d6fea52e24cd8d30fe8661203682406653a55cdfbee1bffca923 Apr 21 14:58:39.255635 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:39.255600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vr92l" event={"ID":"35ffa47c-97fe-49fe-a050-659e851233d4","Type":"ContainerStarted","Data":"941d67649e62f90c146db3a9084a8c7fe068ee7643f5ab94ce1491fe38598978"} Apr 21 14:58:39.256582 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:39.256555 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v55rn" event={"ID":"f3617c41-91e1-4dea-bc4c-4a975db40cbd","Type":"ContainerStarted","Data":"624bd8b9f711d6fea52e24cd8d30fe8661203682406653a55cdfbee1bffca923"} Apr 21 14:58:41.262705 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.262663 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v55rn" event={"ID":"f3617c41-91e1-4dea-bc4c-4a975db40cbd","Type":"ContainerStarted","Data":"154c446ee529e05e83627f089ef788099b272c117cb44efbe8a8a4378d4b286d"} Apr 21 14:58:41.263209 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.262713 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v55rn" event={"ID":"f3617c41-91e1-4dea-bc4c-4a975db40cbd","Type":"ContainerStarted","Data":"cfa839258b1e63ec39bd5179b5c72c1c00dca5a394968f4cb3ed5f28eeeda57f"} Apr 21 14:58:41.263209 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.262785 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v55rn" Apr 21 14:58:41.263893 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.263872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vr92l" event={"ID":"35ffa47c-97fe-49fe-a050-659e851233d4","Type":"ContainerStarted","Data":"df84db22e4adad5d448fbf463a56bab87567f44ffd066c1407829a4d63979fbe"} Apr 21 14:58:41.279767 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.279716 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-v55rn" podStartSLOduration=129.524891636 podStartE2EDuration="2m11.27970155s" podCreationTimestamp="2026-04-21 14:56:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:38.900554808 +0000 UTC m=+161.754430251" lastFinishedPulling="2026-04-21 14:58:40.655364722 +0000 UTC m=+163.509240165" observedRunningTime="2026-04-21 14:58:41.279365715 +0000 UTC m=+164.133241177" watchObservedRunningTime="2026-04-21 14:58:41.27970155 +0000 UTC m=+164.133577011" Apr 21 14:58:41.297186 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.297127 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vr92l" podStartSLOduration=129.522266222 podStartE2EDuration="2m11.297108019s" podCreationTimestamp="2026-04-21 14:56:30 +0000 UTC" firstStartedPulling="2026-04-21 14:58:38.883678428 +0000 UTC m=+161.737553870" lastFinishedPulling="2026-04-21 14:58:40.658520218 +0000 UTC m=+163.512395667" observedRunningTime="2026-04-21 14:58:41.29674177 +0000 UTC m=+164.150617233" watchObservedRunningTime="2026-04-21 14:58:41.297108019 +0000 UTC m=+164.150983481" Apr 21 14:58:41.983263 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.983215 2572 patch_prober.go:28] interesting pod/image-registry-677bfb6859-72cm4 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 14:58:41.983469 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:41.983279 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" podUID="7f4f8693-25c3-43cb-be49-ff52f766df6f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 14:58:43.212192 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:43.212165 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-677bfb6859-72cm4" Apr 21 14:58:46.174637 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:46.174550 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" podUID="2de2a64b-fab9-4f6a-9290-789166c4da39" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 14:58:46.717510 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:46.717471 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 14:58:51.272294 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:51.272264 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v55rn" Apr 21 14:58:53.853132 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:53.853067 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v55rn_f3617c41-91e1-4dea-bc4c-4a975db40cbd/dns/0.log" Apr 21 14:58:54.054692 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:54.054667 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v55rn_f3617c41-91e1-4dea-bc4c-4a975db40cbd/kube-rbac-proxy/0.log" Apr 21 14:58:54.651691 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:54.651665 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-52rdt_31f48be8-c9fa-4f17-9944-60b8aaace332/dns-node-resolver/0.log" Apr 21 14:58:55.851723 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:55.851691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vr92l_35ffa47c-97fe-49fe-a050-659e851233d4/serve-healthcheck-canary/0.log" Apr 21 14:58:56.175250 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:56.175173 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" podUID="2de2a64b-fab9-4f6a-9290-789166c4da39" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 14:58:56.175250 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:56.175241 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" Apr 21 14:58:56.175718 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:56.175686 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"23b5b30fca6ac4d134d66e9c9cd121c70dd6b98de1bc4410d273de7aef609f2d"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 14:58:56.175767 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:56.175752 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" podUID="2de2a64b-fab9-4f6a-9290-789166c4da39" containerName="service-proxy" containerID="cri-o://23b5b30fca6ac4d134d66e9c9cd121c70dd6b98de1bc4410d273de7aef609f2d" gracePeriod=30 Apr 21 14:58:56.305562 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:56.305538 2572 generic.go:358] "Generic (PLEG): container finished" podID="2de2a64b-fab9-4f6a-9290-789166c4da39" containerID="23b5b30fca6ac4d134d66e9c9cd121c70dd6b98de1bc4410d273de7aef609f2d" exitCode=2 Apr 21 14:58:56.305658 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:56.305573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" event={"ID":"2de2a64b-fab9-4f6a-9290-789166c4da39","Type":"ContainerDied","Data":"23b5b30fca6ac4d134d66e9c9cd121c70dd6b98de1bc4410d273de7aef609f2d"} Apr 21 14:58:57.309656 ip-10-0-134-40 kubenswrapper[2572]: I0421 14:58:57.309618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64c6fcc965-rpcx5" event={"ID":"2de2a64b-fab9-4f6a-9290-789166c4da39","Type":"ContainerStarted","Data":"34aed9baffaf3b66c80b4910e21c94a6f948a4dca559a251885404208d09feed"} Apr 21 15:00:08.525102 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:08.525055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 15:00:08.527482 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:08.527458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac89606d-af67-40a9-8819-d321ad5b6b55-metrics-certs\") pod \"network-metrics-daemon-gzsbc\" (UID: \"ac89606d-af67-40a9-8819-d321ad5b6b55\") " pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 15:00:08.621704 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:08.621663 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cb4hj\"" Apr 21 15:00:08.629183 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:08.629156 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzsbc" Apr 21 15:00:08.744091 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:08.744062 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gzsbc"] Apr 21 15:00:08.747650 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:00:08.747617 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac89606d_af67_40a9_8819_d321ad5b6b55.slice/crio-87c2a26625c0f64b27d98ca122c34b269b72b8afae0c7a24ae6bc7bf74e08332 WatchSource:0}: Error finding container 87c2a26625c0f64b27d98ca122c34b269b72b8afae0c7a24ae6bc7bf74e08332: Status 404 returned error can't find the container with id 87c2a26625c0f64b27d98ca122c34b269b72b8afae0c7a24ae6bc7bf74e08332 Apr 21 15:00:09.498662 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:09.498625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gzsbc" event={"ID":"ac89606d-af67-40a9-8819-d321ad5b6b55","Type":"ContainerStarted","Data":"87c2a26625c0f64b27d98ca122c34b269b72b8afae0c7a24ae6bc7bf74e08332"} Apr 21 15:00:10.502968 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:10.502932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gzsbc" event={"ID":"ac89606d-af67-40a9-8819-d321ad5b6b55","Type":"ContainerStarted","Data":"18e540ed4bd0ad1e9f800e3440e1ca48d4e15a7ed1366d1427e68c5f7ef97c5d"} Apr 21 15:00:10.502968 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:10.502969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gzsbc" event={"ID":"ac89606d-af67-40a9-8819-d321ad5b6b55","Type":"ContainerStarted","Data":"e86233e7afe0dc361ca0c0bbfc79214a2e4544afabfc7236e99702dc4fd2bd28"} Apr 21 15:00:10.524476 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:10.524417 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gzsbc" podStartSLOduration=252.52221035 podStartE2EDuration="4m13.524399329s" podCreationTimestamp="2026-04-21 
14:55:57 +0000 UTC" firstStartedPulling="2026-04-21 15:00:08.749577168 +0000 UTC m=+251.603452608" lastFinishedPulling="2026-04-21 15:00:09.751766144 +0000 UTC m=+252.605641587" observedRunningTime="2026-04-21 15:00:10.52416796 +0000 UTC m=+253.378043422" watchObservedRunningTime="2026-04-21 15:00:10.524399329 +0000 UTC m=+253.378274792" Apr 21 15:00:57.600091 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:57.600058 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:00:57.600703 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:00:57.600287 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:01:33.582601 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.582565 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vvmhw"] Apr 21 15:01:33.585708 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.585693 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.588465 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.588441 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:01:33.594938 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.594901 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vvmhw"] Apr 21 15:01:33.614246 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.614213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-original-pull-secret\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.614362 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.614275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-dbus\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.614362 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.614302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-kubelet-config\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.715302 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.715274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-dbus\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.715302 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.715305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-kubelet-config\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.715513 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.715343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-original-pull-secret\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.715513 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.715423 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-kubelet-config\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.715513 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.715458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-dbus\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.717559 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.717542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d8fc930a-a19b-433f-8c27-8eb6887b0e8e-original-pull-secret\") pod \"global-pull-secret-syncer-vvmhw\" (UID: \"d8fc930a-a19b-433f-8c27-8eb6887b0e8e\") " pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:33.894634 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:33.894543 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vvmhw" Apr 21 15:01:34.008311 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:34.008278 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vvmhw"] Apr 21 15:01:34.011266 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:01:34.011231 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8fc930a_a19b_433f_8c27_8eb6887b0e8e.slice/crio-5d06ef2b48fde581f5b7fa7668a81648979422d7619be2ddb370073d3eec8fe9 WatchSource:0}: Error finding container 5d06ef2b48fde581f5b7fa7668a81648979422d7619be2ddb370073d3eec8fe9: Status 404 returned error can't find the container with id 5d06ef2b48fde581f5b7fa7668a81648979422d7619be2ddb370073d3eec8fe9 Apr 21 15:01:34.012778 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:34.012765 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:01:34.732458 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:34.732421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vvmhw" event={"ID":"d8fc930a-a19b-433f-8c27-8eb6887b0e8e","Type":"ContainerStarted","Data":"5d06ef2b48fde581f5b7fa7668a81648979422d7619be2ddb370073d3eec8fe9"} Apr 21 15:01:38.745176 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:38.745145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vvmhw" event={"ID":"d8fc930a-a19b-433f-8c27-8eb6887b0e8e","Type":"ContainerStarted","Data":"0386f32a73cc51757798c4a5eb6e1cf265bd6bc74d937c4758dae048f27d7b34"} Apr 21 15:01:38.761178 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:01:38.761137 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vvmhw" podStartSLOduration=1.157630454 podStartE2EDuration="5.761123365s" podCreationTimestamp="2026-04-21 15:01:33 +0000 UTC" firstStartedPulling="2026-04-21 15:01:34.012888663 +0000 UTC m=+336.866764103" lastFinishedPulling="2026-04-21 15:01:38.616381557 +0000 UTC m=+341.470257014" observedRunningTime="2026-04-21 15:01:38.760215513 +0000 UTC m=+341.614091018" watchObservedRunningTime="2026-04-21 15:01:38.761123365 +0000 UTC m=+341.614998826" Apr 21 15:02:17.226925 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.226869 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx"] Apr 21 15:02:17.230003 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.229981 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.232960 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.232939 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:02:17.232960 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.232957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:02:17.234098 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.234083 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:02:17.239130 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.239113 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx"] Apr 21 15:02:17.327609 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.327578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.327609 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.327612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvx2g\" (UniqueName: \"kubernetes.io/projected/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-kube-api-access-nvx2g\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.327800 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.327694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.428459 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.428432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.428532 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.428473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.428532 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.428490 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nvx2g\" (UniqueName: \"kubernetes.io/projected/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-kube-api-access-nvx2g\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.428881 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.428860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.428881 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.428870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.438089 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.438063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvx2g\" (UniqueName: \"kubernetes.io/projected/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-kube-api-access-nvx2g\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.539688 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.539601 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:17.655280 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.655118 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx"] Apr 21 15:02:17.658014 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:02:17.657988 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b20fc1_5d03_4aa3_8ab6_3fd7d4ee8952.slice/crio-3b882451d9a9ed4e1a285bfda5fa8a14dcfd6d379fa417fac837c41175b1413b WatchSource:0}: Error finding container 3b882451d9a9ed4e1a285bfda5fa8a14dcfd6d379fa417fac837c41175b1413b: Status 404 returned error can't find the container with id 3b882451d9a9ed4e1a285bfda5fa8a14dcfd6d379fa417fac837c41175b1413b Apr 21 15:02:17.844742 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:17.844651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" event={"ID":"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952","Type":"ContainerStarted","Data":"3b882451d9a9ed4e1a285bfda5fa8a14dcfd6d379fa417fac837c41175b1413b"} Apr 21 15:02:23.860053 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:23.860012 2572 generic.go:358] "Generic (PLEG): container finished" podID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerID="c57e9bde4edd3b053cabbfca7cab84d91d2a2de36ddb7413ce5f4f70f1a38e42" exitCode=0 Apr 21 15:02:23.860540 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:23.860103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" event={"ID":"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952","Type":"ContainerDied","Data":"c57e9bde4edd3b053cabbfca7cab84d91d2a2de36ddb7413ce5f4f70f1a38e42"} Apr 21 15:02:26.870660 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:26.870617 2572 generic.go:358] "Generic (PLEG): container finished" podID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerID="ac6ade87afeca7d71a1ffd902dbcf49c9bc0f9502245cfa81db47e4e7fbb9588" exitCode=0 Apr 21 15:02:26.871057 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:26.870682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" event={"ID":"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952","Type":"ContainerDied","Data":"ac6ade87afeca7d71a1ffd902dbcf49c9bc0f9502245cfa81db47e4e7fbb9588"} Apr 21 15:02:34.894257 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:34.894213 2572 generic.go:358] "Generic (PLEG): container finished" podID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerID="1fbff947cf96afc1d918c2b4e949cd79ee878afbf02af9920d0f7f5967d6f00b" exitCode=0 Apr 21 15:02:34.894659 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:34.894294 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" event={"ID":"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952","Type":"ContainerDied","Data":"1fbff947cf96afc1d918c2b4e949cd79ee878afbf02af9920d0f7f5967d6f00b"} Apr 21 15:02:36.018212 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.018184 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:36.076852 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.076828 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvx2g\" (UniqueName: \"kubernetes.io/projected/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-kube-api-access-nvx2g\") pod \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " Apr 21 15:02:36.077000 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.076873 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-bundle\") pod \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " Apr 21 15:02:36.077000 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.076898 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-util\") pod \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\" (UID: \"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952\") " Apr 21 15:02:36.077519 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.077492 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-bundle" (OuterVolumeSpecName: "bundle") pod "25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" (UID: "25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:02:36.078974 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.078952 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-kube-api-access-nvx2g" (OuterVolumeSpecName: "kube-api-access-nvx2g") pod "25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" (UID: "25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952"). InnerVolumeSpecName "kube-api-access-nvx2g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:02:36.080728 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.080704 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-util" (OuterVolumeSpecName: "util") pod "25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" (UID: "25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:02:36.177844 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.177789 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvx2g\" (UniqueName: \"kubernetes.io/projected/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-kube-api-access-nvx2g\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:02:36.177844 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.177809 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:02:36.177844 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.177818 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:02:36.901350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.901311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" event={"ID":"25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952","Type":"ContainerDied","Data":"3b882451d9a9ed4e1a285bfda5fa8a14dcfd6d379fa417fac837c41175b1413b"} Apr 21 15:02:36.901350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.901349 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b882451d9a9ed4e1a285bfda5fa8a14dcfd6d379fa417fac837c41175b1413b" Apr 21 15:02:36.901350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:36.901329 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgmtzx" Apr 21 15:02:45.481140 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481105 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn"] Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481332 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="util" Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481342 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="util" Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481350 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="extract" Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481356 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="extract" Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481365 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="pull" Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481370 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="pull" Apr 21 15:02:45.481602 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.481415 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="25b20fc1-5d03-4aa3-8ab6-3fd7d4ee8952" containerName="extract" 
Apr 21 15:02:45.484174 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.484158 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.487101 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.487071 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:02:45.487209 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.487188 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:02:45.488254 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.488234 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:02:45.499099 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.499077 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn"] Apr 21 15:02:45.541350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.541317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.541517 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.541368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.541517 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.541389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5c8\" (UniqueName: \"kubernetes.io/projected/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-kube-api-access-mz5c8\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.642670 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.642641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.642780 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.642687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.642850 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.642812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5c8\" (UniqueName: \"kubernetes.io/projected/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-kube-api-access-mz5c8\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.643060 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.643043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.643108 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.643075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.664925 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.664879 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5c8\" (UniqueName: \"kubernetes.io/projected/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-kube-api-access-mz5c8\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.793172 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.793087 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:45.912397 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.912214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn"] Apr 21 15:02:45.914970 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:02:45.914940 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bfe421d_2eca_4643_9a52_efde8fb9b4f9.slice/crio-0779e2356c18ca9a4fdb062cf4ca038ffdb203ace0a1039cdea752a075b6b2ba WatchSource:0}: Error finding container 0779e2356c18ca9a4fdb062cf4ca038ffdb203ace0a1039cdea752a075b6b2ba: Status 404 returned error can't find the container with id 0779e2356c18ca9a4fdb062cf4ca038ffdb203ace0a1039cdea752a075b6b2ba Apr 21 15:02:45.926219 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:45.926186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" event={"ID":"4bfe421d-2eca-4643-9a52-efde8fb9b4f9","Type":"ContainerStarted","Data":"0779e2356c18ca9a4fdb062cf4ca038ffdb203ace0a1039cdea752a075b6b2ba"} Apr 21 15:02:46.929537 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:46.929451 2572 generic.go:358] "Generic (PLEG): container finished" podID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerID="2d56646eb0fc72c05bbf7b95f2f23a5351135d20883a161bc6666450be384a17" exitCode=0 Apr 21 15:02:46.929953 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:46.929537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" event={"ID":"4bfe421d-2eca-4643-9a52-efde8fb9b4f9","Type":"ContainerDied","Data":"2d56646eb0fc72c05bbf7b95f2f23a5351135d20883a161bc6666450be384a17"} Apr 21 15:02:49.939176 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:49.939145 2572 generic.go:358] "Generic (PLEG): container finished" podID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerID="0ba24e3b660e3af0403237a1c8a1e13dc6c16b75e1a093ec5c731d5c779b77ad" exitCode=0 Apr 21 15:02:49.939529 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:49.939198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" event={"ID":"4bfe421d-2eca-4643-9a52-efde8fb9b4f9","Type":"ContainerDied","Data":"0ba24e3b660e3af0403237a1c8a1e13dc6c16b75e1a093ec5c731d5c779b77ad"} Apr 21 15:02:50.943561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:50.943528 2572 generic.go:358] "Generic (PLEG): container finished" podID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerID="a5b00a5707a9ae2363875c76873049ebfe32b6fb2076dcffb1e66eefa685e702" exitCode=0 Apr 21 15:02:50.943958 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:50.943604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" event={"ID":"4bfe421d-2eca-4643-9a52-efde8fb9b4f9","Type":"ContainerDied","Data":"a5b00a5707a9ae2363875c76873049ebfe32b6fb2076dcffb1e66eefa685e702"} Apr 21 15:02:52.057549 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.057526 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:02:52.087287 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.087264 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-bundle\") pod \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " Apr 21 15:02:52.087412 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.087297 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz5c8\" (UniqueName: \"kubernetes.io/projected/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-kube-api-access-mz5c8\") pod \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " Apr 21 15:02:52.087412 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.087318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-util\") pod \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\" (UID: \"4bfe421d-2eca-4643-9a52-efde8fb9b4f9\") " Apr 21 15:02:52.087677 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.087647 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-bundle" (OuterVolumeSpecName: "bundle") pod "4bfe421d-2eca-4643-9a52-efde8fb9b4f9" (UID: "4bfe421d-2eca-4643-9a52-efde8fb9b4f9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:02:52.089160 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.089138 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-kube-api-access-mz5c8" (OuterVolumeSpecName: "kube-api-access-mz5c8") pod "4bfe421d-2eca-4643-9a52-efde8fb9b4f9" (UID: "4bfe421d-2eca-4643-9a52-efde8fb9b4f9"). InnerVolumeSpecName "kube-api-access-mz5c8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:02:52.091819 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.091794 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-util" (OuterVolumeSpecName: "util") pod "4bfe421d-2eca-4643-9a52-efde8fb9b4f9" (UID: "4bfe421d-2eca-4643-9a52-efde8fb9b4f9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:02:52.188075 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.188041 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:02:52.188075 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.188073 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mz5c8\" (UniqueName: \"kubernetes.io/projected/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-kube-api-access-mz5c8\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:02:52.188075 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.188084 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bfe421d-2eca-4643-9a52-efde8fb9b4f9-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:02:52.950207 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.950162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" event={"ID":"4bfe421d-2eca-4643-9a52-efde8fb9b4f9","Type":"ContainerDied","Data":"0779e2356c18ca9a4fdb062cf4ca038ffdb203ace0a1039cdea752a075b6b2ba"} Apr 21 15:02:52.950207 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.950201 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0779e2356c18ca9a4fdb062cf4ca038ffdb203ace0a1039cdea752a075b6b2ba" Apr 21 15:02:52.950413 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:02:52.950221 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f24dpn" Apr 21 15:03:04.298728 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298690 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc"] Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298942 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="pull" Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298953 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="pull" Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298961 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="util" Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298967 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="util" Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298976 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="extract" Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.298981 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="extract" Apr 21 15:03:04.299138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.299024 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bfe421d-2eca-4643-9a52-efde8fb9b4f9" containerName="extract" 
Apr 21 15:03:04.303455 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.303438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.306190 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.306158 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:03:04.306315 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.306190 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:03:04.306315 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.306214 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:03:04.310777 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.310756 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc"] Apr 21 15:03:04.374128 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.374089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdqz\" (UniqueName: \"kubernetes.io/projected/6218e397-c71c-4f8d-9260-039cb8599ec1-kube-api-access-spdqz\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.374128 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.374130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.374336 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.374196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.475432 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.475395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spdqz\" (UniqueName: \"kubernetes.io/projected/6218e397-c71c-4f8d-9260-039cb8599ec1-kube-api-access-spdqz\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.475432 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.475432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.475641 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.475619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.475777 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.475759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.475955 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.475938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.483585 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.483559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdqz\" (UniqueName: \"kubernetes.io/projected/6218e397-c71c-4f8d-9260-039cb8599ec1-kube-api-access-spdqz\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.612757 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.612644 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:04.730628 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.730596 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc"] Apr 21 15:03:04.733767 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:03:04.733734 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6218e397_c71c_4f8d_9260_039cb8599ec1.slice/crio-b0e78a0224f3c5648575d162266769cf77b10e14fd23103d39be77609bb2e298 WatchSource:0}: Error finding container b0e78a0224f3c5648575d162266769cf77b10e14fd23103d39be77609bb2e298: Status 404 returned error can't find the container with id b0e78a0224f3c5648575d162266769cf77b10e14fd23103d39be77609bb2e298 Apr 21 15:03:04.983155 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.983111 2572 generic.go:358] "Generic (PLEG): container finished" podID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerID="c4099add5578af192fb824a17c8469c0aa17c41870d4c0aa7d7a7292f08dd3f2" exitCode=0 Apr 21 15:03:04.983322 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.983179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" event={"ID":"6218e397-c71c-4f8d-9260-039cb8599ec1","Type":"ContainerDied","Data":"c4099add5578af192fb824a17c8469c0aa17c41870d4c0aa7d7a7292f08dd3f2"} Apr 21 15:03:04.983322 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:04.983204 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" event={"ID":"6218e397-c71c-4f8d-9260-039cb8599ec1","Type":"ContainerStarted","Data":"b0e78a0224f3c5648575d162266769cf77b10e14fd23103d39be77609bb2e298"} Apr 21 15:03:05.988029 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:05.987995 2572 generic.go:358] "Generic (PLEG): container finished" podID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerID="b59cd28d0681455d74be58d9c71eafb7598a59af58c027ae4639e64bf3808493" exitCode=0 Apr 21 15:03:05.988431 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:05.988035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" event={"ID":"6218e397-c71c-4f8d-9260-039cb8599ec1","Type":"ContainerDied","Data":"b59cd28d0681455d74be58d9c71eafb7598a59af58c027ae4639e64bf3808493"} Apr 21 15:03:06.992653 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:06.992618 2572 generic.go:358] "Generic (PLEG): container finished" podID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerID="2517ff8280340e25e158b63b94d17af065a52731a71e201a17f9885414767325" exitCode=0 Apr 21 15:03:06.993028 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:06.992672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" event={"ID":"6218e397-c71c-4f8d-9260-039cb8599ec1","Type":"ContainerDied","Data":"2517ff8280340e25e158b63b94d17af065a52731a71e201a17f9885414767325"} Apr 21 15:03:08.110806 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.110783 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:08.204590 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.204554 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdqz\" (UniqueName: \"kubernetes.io/projected/6218e397-c71c-4f8d-9260-039cb8599ec1-kube-api-access-spdqz\") pod \"6218e397-c71c-4f8d-9260-039cb8599ec1\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " Apr 21 15:03:08.204779 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.204631 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-util\") pod \"6218e397-c71c-4f8d-9260-039cb8599ec1\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " Apr 21 15:03:08.204779 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.204673 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-bundle\") pod \"6218e397-c71c-4f8d-9260-039cb8599ec1\" (UID: \"6218e397-c71c-4f8d-9260-039cb8599ec1\") " Apr 21 15:03:08.205421 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.205392 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-bundle" (OuterVolumeSpecName: "bundle") pod "6218e397-c71c-4f8d-9260-039cb8599ec1" (UID: "6218e397-c71c-4f8d-9260-039cb8599ec1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:08.206617 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.206588 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6218e397-c71c-4f8d-9260-039cb8599ec1-kube-api-access-spdqz" (OuterVolumeSpecName: "kube-api-access-spdqz") pod "6218e397-c71c-4f8d-9260-039cb8599ec1" (UID: "6218e397-c71c-4f8d-9260-039cb8599ec1"). InnerVolumeSpecName "kube-api-access-spdqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:03:08.210157 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.210127 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-util" (OuterVolumeSpecName: "util") pod "6218e397-c71c-4f8d-9260-039cb8599ec1" (UID: "6218e397-c71c-4f8d-9260-039cb8599ec1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:08.305794 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.305699 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:08.305794 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.305729 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6218e397-c71c-4f8d-9260-039cb8599ec1-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:08.305794 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.305745 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spdqz\" (UniqueName: \"kubernetes.io/projected/6218e397-c71c-4f8d-9260-039cb8599ec1-kube-api-access-spdqz\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:08.998823 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.998780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" event={"ID":"6218e397-c71c-4f8d-9260-039cb8599ec1","Type":"ContainerDied","Data":"b0e78a0224f3c5648575d162266769cf77b10e14fd23103d39be77609bb2e298"} Apr 21 15:03:08.998823 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.998809 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wlxgc" Apr 21 15:03:08.998823 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:08.998817 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e78a0224f3c5648575d162266769cf77b10e14fd23103d39be77609bb2e298" Apr 21 15:03:18.894970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.894849 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq"] Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895163 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="extract" Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895184 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="extract" Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895213 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="pull" Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895220 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="pull" Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895235 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="util" Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895244 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="util" Apr 21 15:03:18.895555 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.895298 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6218e397-c71c-4f8d-9260-039cb8599ec1" containerName="extract" 
Apr 21 15:03:18.898248 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.898231 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h"] Apr 21 15:03:18.898396 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.898376 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:18.901262 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.901238 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:18.905097 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.905026 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 15:03:18.905327 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.905304 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:03:18.905633 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.905357 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 15:03:18.905633 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.905393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 15:03:18.905633 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.905418 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 15:03:18.906407 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.906391 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:03:18.909370 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.909349 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:03:18.909477 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.909458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-86sts\"" Apr 21 15:03:18.918591 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.918572 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq"] Apr 21 15:03:18.940001 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.939971 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h"] Apr 21 15:03:18.980771 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.980738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:18.980771 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.980773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/007c6968-5570-4f49-817c-1fc4331bf1f3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:18.981020 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.980793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/007c6968-5570-4f49-817c-1fc4331bf1f3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:18.981020 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.980838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:18.981020 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.980887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzfv\" (UniqueName: \"kubernetes.io/projected/263b6153-719a-4f6c-a494-fad21428026f-kube-api-access-jwzfv\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:18.981020 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:18.980979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp49v\" (UniqueName: \"kubernetes.io/projected/007c6968-5570-4f49-817c-1fc4331bf1f3-kube-api-access-bp49v\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.081783 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.081735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp49v\" (UniqueName: \"kubernetes.io/projected/007c6968-5570-4f49-817c-1fc4331bf1f3-kube-api-access-bp49v\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.081783 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.081783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.082088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.081799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/007c6968-5570-4f49-817c-1fc4331bf1f3-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.082088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.081818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/007c6968-5570-4f49-817c-1fc4331bf1f3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.082088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.081838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.082088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.081872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzfv\" (UniqueName: \"kubernetes.io/projected/263b6153-719a-4f6c-a494-fad21428026f-kube-api-access-jwzfv\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.082303 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.082256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.082303 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.082276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.084366 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.084340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/007c6968-5570-4f49-817c-1fc4331bf1f3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.084435 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.084373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/007c6968-5570-4f49-817c-1fc4331bf1f3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.100511 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.100476 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp49v\" (UniqueName: \"kubernetes.io/projected/007c6968-5570-4f49-817c-1fc4331bf1f3-kube-api-access-bp49v\") pod \"opendatahub-operator-controller-manager-6cfc874c8f-nrl6h\" (UID: \"007c6968-5570-4f49-817c-1fc4331bf1f3\") " pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.101152 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.101135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzfv\" (UniqueName: \"kubernetes.io/projected/263b6153-719a-4f6c-a494-fad21428026f-kube-api-access-jwzfv\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.208673 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.208644 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:19.214454 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.214425 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:19.356434 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.356398 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq"] Apr 21 15:03:19.358802 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:03:19.358772 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263b6153_719a_4f6c_a494_fad21428026f.slice/crio-5e371754a903ce202fbc077f02147a6b11850841564b5479c0ba347cf403daf9 WatchSource:0}: Error finding container 5e371754a903ce202fbc077f02147a6b11850841564b5479c0ba347cf403daf9: Status 404 returned error can't find the container with id 5e371754a903ce202fbc077f02147a6b11850841564b5479c0ba347cf403daf9 Apr 21 15:03:19.369771 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:19.369731 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h"] Apr 21 15:03:19.372464 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:03:19.372427 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod007c6968_5570_4f49_817c_1fc4331bf1f3.slice/crio-a6267adb6fdf742e2452a157fb829f6593efb84c90c87aabd02bd20743ced285 WatchSource:0}: Error finding container a6267adb6fdf742e2452a157fb829f6593efb84c90c87aabd02bd20743ced285: Status 404 returned error can't find the container with id a6267adb6fdf742e2452a157fb829f6593efb84c90c87aabd02bd20743ced285 Apr 21 15:03:20.029993 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:20.029955 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" event={"ID":"007c6968-5570-4f49-817c-1fc4331bf1f3","Type":"ContainerStarted","Data":"a6267adb6fdf742e2452a157fb829f6593efb84c90c87aabd02bd20743ced285"} Apr 21 15:03:20.031438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:20.031413 2572 generic.go:358] "Generic (PLEG): container finished" podID="263b6153-719a-4f6c-a494-fad21428026f" containerID="e673e437165fdf4fc3b7be25d56c7939c623d787eaa9849e60731c4839795c6e" exitCode=0 
Apr 21 15:03:20.031581 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:20.031461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" event={"ID":"263b6153-719a-4f6c-a494-fad21428026f","Type":"ContainerDied","Data":"e673e437165fdf4fc3b7be25d56c7939c623d787eaa9849e60731c4839795c6e"} Apr 21 15:03:20.031581 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:20.031487 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" event={"ID":"263b6153-719a-4f6c-a494-fad21428026f","Type":"ContainerStarted","Data":"5e371754a903ce202fbc077f02147a6b11850841564b5479c0ba347cf403daf9"} Apr 21 15:03:23.042483 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:23.042447 2572 generic.go:358] "Generic (PLEG): container finished" podID="263b6153-719a-4f6c-a494-fad21428026f" containerID="63fac9f311f3c09b7b3169fd75d21e679cf9921a7163e8b51940b983e9799615" exitCode=0 Apr 21 15:03:23.042946 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:23.042531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" event={"ID":"263b6153-719a-4f6c-a494-fad21428026f","Type":"ContainerDied","Data":"63fac9f311f3c09b7b3169fd75d21e679cf9921a7163e8b51940b983e9799615"} Apr 21 15:03:23.043940 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:23.043901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" event={"ID":"007c6968-5570-4f49-817c-1fc4331bf1f3","Type":"ContainerStarted","Data":"f37a6088af54b22fc295a69806b88d5b3e4a6cc4562ffa6b89ed8a0beedf0204"} Apr 21 15:03:23.044099 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:23.044079 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:23.081225 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:23.081186 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" podStartSLOduration=2.406256466 podStartE2EDuration="5.081174609s" podCreationTimestamp="2026-04-21 15:03:18 +0000 UTC" firstStartedPulling="2026-04-21 15:03:19.374327046 +0000 UTC m=+442.228202486" lastFinishedPulling="2026-04-21 15:03:22.049245186 +0000 UTC m=+444.903120629" observedRunningTime="2026-04-21 15:03:23.07964173 +0000 UTC m=+445.933517193" watchObservedRunningTime="2026-04-21 15:03:23.081174609 +0000 UTC m=+445.935050070" Apr 21 15:03:24.049086 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:24.049042 2572 generic.go:358] "Generic (PLEG): container finished" podID="263b6153-719a-4f6c-a494-fad21428026f" containerID="11e8cccfe84fa86866cf84e81978f781ac44cc23b0714f244e4e9c26a6657ca3" exitCode=0 Apr 21 15:03:24.049445 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:24.049102 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" event={"ID":"263b6153-719a-4f6c-a494-fad21428026f","Type":"ContainerDied","Data":"11e8cccfe84fa86866cf84e81978f781ac44cc23b0714f244e4e9c26a6657ca3"} Apr 21 15:03:25.163438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.163415 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:25.232413 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.232384 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-util\") pod \"263b6153-719a-4f6c-a494-fad21428026f\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " Apr 21 15:03:25.232570 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.232421 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzfv\" (UniqueName: \"kubernetes.io/projected/263b6153-719a-4f6c-a494-fad21428026f-kube-api-access-jwzfv\") pod \"263b6153-719a-4f6c-a494-fad21428026f\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " Apr 21 15:03:25.232570 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.232453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-bundle\") pod \"263b6153-719a-4f6c-a494-fad21428026f\" (UID: \"263b6153-719a-4f6c-a494-fad21428026f\") " Apr 21 15:03:25.233358 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.233333 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-bundle" (OuterVolumeSpecName: "bundle") pod "263b6153-719a-4f6c-a494-fad21428026f" (UID: "263b6153-719a-4f6c-a494-fad21428026f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:25.234477 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.234455 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263b6153-719a-4f6c-a494-fad21428026f-kube-api-access-jwzfv" (OuterVolumeSpecName: "kube-api-access-jwzfv") pod "263b6153-719a-4f6c-a494-fad21428026f" (UID: "263b6153-719a-4f6c-a494-fad21428026f"). InnerVolumeSpecName "kube-api-access-jwzfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:03:25.238733 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.238710 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-util" (OuterVolumeSpecName: "util") pod "263b6153-719a-4f6c-a494-fad21428026f" (UID: "263b6153-719a-4f6c-a494-fad21428026f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:25.333578 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.333495 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:25.333578 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.333533 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/263b6153-719a-4f6c-a494-fad21428026f-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:25.333578 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:25.333543 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwzfv\" (UniqueName: \"kubernetes.io/projected/263b6153-719a-4f6c-a494-fad21428026f-kube-api-access-jwzfv\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:26.056082 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:26.055979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" event={"ID":"263b6153-719a-4f6c-a494-fad21428026f","Type":"ContainerDied","Data":"5e371754a903ce202fbc077f02147a6b11850841564b5479c0ba347cf403daf9"} Apr 21 15:03:26.056082 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:26.056033 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9m7kjq" Apr 21 15:03:26.056318 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:26.056021 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e371754a903ce202fbc077f02147a6b11850841564b5479c0ba347cf403daf9" Apr 21 15:03:31.317996 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.317953 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9"] Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318213 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="pull" Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318226 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="pull" Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318237 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="extract" Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318244 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="extract" Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318254 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="util" Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318259 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="util" Apr 21 15:03:31.318441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.318307 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="263b6153-719a-4f6c-a494-fad21428026f" containerName="extract" Apr 21 15:03:31.323148 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.323127 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.327798 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.327768 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 15:03:31.327981 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.327808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 15:03:31.327981 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.327851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sz9p8\"" Apr 21 15:03:31.328121 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.328001 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 15:03:31.328121 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.328052 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 15:03:31.328121 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.328102 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:03:31.341441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.341418 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9"] Apr 21 15:03:31.376394 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.376367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-manager-config\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.376475 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.376403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9vr\" (UniqueName: \"kubernetes.io/projected/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-kube-api-access-vc9vr\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.376475 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.376456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-metrics-cert\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.376554 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.376486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-cert\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.477706 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.477679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-metrics-cert\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.477815 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.477722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-cert\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.477928 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.477881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-manager-config\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.477970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.477953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9vr\" (UniqueName: \"kubernetes.io/projected/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-kube-api-access-vc9vr\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.478480 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.478450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-manager-config\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.480529 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.480501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-cert\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.480611 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.480532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-metrics-cert\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.489750 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.489720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9vr\" (UniqueName: \"kubernetes.io/projected/05ac9066-ab4c-4d3b-9a2c-ba6e873e0793-kube-api-access-vc9vr\") pod \"lws-controller-manager-586c4cccd6-vlql9\" (UID: \"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793\") " pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.632812 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.632753 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:31.820692 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:03:31.820665 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ac9066_ab4c_4d3b_9a2c_ba6e873e0793.slice/crio-a22c37e4d60d3f1708d668e984d7003980f33f55f239bc7d64ef0918554fff72 WatchSource:0}: Error finding container a22c37e4d60d3f1708d668e984d7003980f33f55f239bc7d64ef0918554fff72: Status 404 returned error can't find the container with id a22c37e4d60d3f1708d668e984d7003980f33f55f239bc7d64ef0918554fff72 Apr 21 15:03:31.831515 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:31.831492 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9"] Apr 21 15:03:32.073376 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:32.073343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" event={"ID":"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793","Type":"ContainerStarted","Data":"a22c37e4d60d3f1708d668e984d7003980f33f55f239bc7d64ef0918554fff72"} Apr 21 15:03:34.051735 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:34.051668 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cfc874c8f-nrl6h" Apr 21 15:03:34.081212 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:34.081180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" event={"ID":"05ac9066-ab4c-4d3b-9a2c-ba6e873e0793","Type":"ContainerStarted","Data":"525f3000de6cd40f788a3bee811d9d7d2b9332d2d944ba9e5fe6b41739d64eaf"} Apr 21 15:03:34.081331 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:34.081295 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:34.105256 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:34.105208 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" podStartSLOduration=1.219049323 podStartE2EDuration="3.105195583s" podCreationTimestamp="2026-04-21 15:03:31 +0000 UTC" firstStartedPulling="2026-04-21 15:03:31.822333979 +0000 UTC m=+454.676209419" lastFinishedPulling="2026-04-21 15:03:33.708480239 +0000 UTC m=+456.562355679" observedRunningTime="2026-04-21 15:03:34.104216875 +0000 UTC m=+456.958092362" watchObservedRunningTime="2026-04-21 15:03:34.105195583 +0000 UTC m=+456.959071046" Apr 21 15:03:36.650375 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.650338 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t"] Apr 21 15:03:36.653739 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.653715 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.656915 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.656880 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:03:36.657192 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.657174 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:03:36.658031 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.658016 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:03:36.672539 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.672513 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t"] Apr 21 15:03:36.718676 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.718653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.718780 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.718681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp4gt\" (UniqueName: \"kubernetes.io/projected/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-kube-api-access-jp4gt\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.718780 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.718719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.819225 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.819197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.819349 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.819230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp4gt\" (UniqueName: \"kubernetes.io/projected/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-kube-api-access-jp4gt\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.819349 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.819276 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.819599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.819579 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.819658 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.819611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.828361 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.828343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp4gt\" (UniqueName: \"kubernetes.io/projected/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-kube-api-access-jp4gt\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:36.962584 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:36.962554 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:37.080398 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:37.080372 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t"] Apr 21 15:03:37.082489 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:03:37.082461 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc90b5f32_b98b_4d67_b7a7_5d1150f226b1.slice/crio-b951bd0baef534af7d83609c625800633e3f0a68d322b1e8bc64d454a434c741 WatchSource:0}: Error finding container b951bd0baef534af7d83609c625800633e3f0a68d322b1e8bc64d454a434c741: Status 404 returned error can't find the container with id b951bd0baef534af7d83609c625800633e3f0a68d322b1e8bc64d454a434c741 Apr 21 15:03:37.090150 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:37.090124 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" event={"ID":"c90b5f32-b98b-4d67-b7a7-5d1150f226b1","Type":"ContainerStarted","Data":"b951bd0baef534af7d83609c625800633e3f0a68d322b1e8bc64d454a434c741"} Apr 21 15:03:38.094175 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:38.094085 2572 generic.go:358] "Generic (PLEG): container finished" podID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerID="164c381e11c5c9fb1f7282ba6629cb4795fe51f9dda3a5fd263d6708e3465957" exitCode=0 Apr 21 15:03:38.094512 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:38.094177 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" event={"ID":"c90b5f32-b98b-4d67-b7a7-5d1150f226b1","Type":"ContainerDied","Data":"164c381e11c5c9fb1f7282ba6629cb4795fe51f9dda3a5fd263d6708e3465957"} Apr 21 15:03:40.100755 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:40.100714 2572 generic.go:358] "Generic (PLEG): container finished" podID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerID="e17d53f019e1b42f1689c29801f8255068244784b18af920de62a02129125de4" exitCode=0 Apr 21 15:03:40.101211 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:40.100795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" event={"ID":"c90b5f32-b98b-4d67-b7a7-5d1150f226b1","Type":"ContainerDied","Data":"e17d53f019e1b42f1689c29801f8255068244784b18af920de62a02129125de4"} Apr 21 15:03:41.105322 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:41.105291 2572 generic.go:358] "Generic (PLEG): container finished" podID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerID="f1aa00fbd34c73fcc992fde8553cac0169b2fb78907b27b5dcdf28bdc9a55450" exitCode=0 Apr 21 15:03:41.105669 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:41.105366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" event={"ID":"c90b5f32-b98b-4d67-b7a7-5d1150f226b1","Type":"ContainerDied","Data":"f1aa00fbd34c73fcc992fde8553cac0169b2fb78907b27b5dcdf28bdc9a55450"} Apr 21 15:03:42.228174 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.228143 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:42.360042 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.359942 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp4gt\" (UniqueName: \"kubernetes.io/projected/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-kube-api-access-jp4gt\") pod \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " Apr 21 15:03:42.360042 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.360022 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-util\") pod \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " Apr 21 15:03:42.360250 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.360068 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-bundle\") pod \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\" (UID: \"c90b5f32-b98b-4d67-b7a7-5d1150f226b1\") " Apr 21 15:03:42.360879 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.360854 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-bundle" (OuterVolumeSpecName: "bundle") pod "c90b5f32-b98b-4d67-b7a7-5d1150f226b1" (UID: "c90b5f32-b98b-4d67-b7a7-5d1150f226b1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:42.362163 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.362136 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-kube-api-access-jp4gt" (OuterVolumeSpecName: "kube-api-access-jp4gt") pod "c90b5f32-b98b-4d67-b7a7-5d1150f226b1" (UID: "c90b5f32-b98b-4d67-b7a7-5d1150f226b1"). InnerVolumeSpecName "kube-api-access-jp4gt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:03:42.410926 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.410878 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-util" (OuterVolumeSpecName: "util") pod "c90b5f32-b98b-4d67-b7a7-5d1150f226b1" (UID: "c90b5f32-b98b-4d67-b7a7-5d1150f226b1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:42.461245 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.461208 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:42.461245 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.461240 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:42.461245 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:42.461252 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jp4gt\" (UniqueName: \"kubernetes.io/projected/c90b5f32-b98b-4d67-b7a7-5d1150f226b1-kube-api-access-jp4gt\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:43.112769 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:43.112733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" event={"ID":"c90b5f32-b98b-4d67-b7a7-5d1150f226b1","Type":"ContainerDied","Data":"b951bd0baef534af7d83609c625800633e3f0a68d322b1e8bc64d454a434c741"} Apr 21 15:03:43.112769 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:43.112752 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354bg8t" Apr 21 15:03:43.112769 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:43.112769 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b951bd0baef534af7d83609c625800633e3f0a68d322b1e8bc64d454a434c741" Apr 21 15:03:45.086218 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:45.086189 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-586c4cccd6-vlql9" Apr 21 15:03:50.887562 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887524 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z"] Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887772 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerName="util" Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887782 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerName="util" Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887792 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerName="pull" Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887797 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerName="pull" Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887804 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerName="extract" Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887811 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" 
containerName="extract" Apr 21 15:03:50.888072 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.887856 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c90b5f32-b98b-4d67-b7a7-5d1150f226b1" containerName="extract" Apr 21 15:03:50.892445 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.892422 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:50.895724 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.895699 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:03:50.896929 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.896882 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:03:50.897054 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.896882 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:03:50.904279 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:50.904253 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z"] Apr 21 15:03:51.025505 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.025473 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.025672 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.025513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.025672 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.025544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27j7w\" (UniqueName: \"kubernetes.io/projected/46b855be-2d57-4c71-b23f-784ee49b57fe-kube-api-access-27j7w\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.125975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.125923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27j7w\" (UniqueName: \"kubernetes.io/projected/46b855be-2d57-4c71-b23f-784ee49b57fe-kube-api-access-27j7w\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.126155 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.126011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.126155 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.126037 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.126475 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.126451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.126516 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.126458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.162203 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.162116 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j7w\" (UniqueName: \"kubernetes.io/projected/46b855be-2d57-4c71-b23f-784ee49b57fe-kube-api-access-27j7w\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.201620 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.201582 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:51.384253 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:51.384220 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z"] Apr 21 15:03:51.387940 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:03:51.387896 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b855be_2d57_4c71_b23f_784ee49b57fe.slice/crio-baaedfea8b7bf3ac8fdb286e05f1c318bad3476f01de695ae7a0912b4c7b582a WatchSource:0}: Error finding container baaedfea8b7bf3ac8fdb286e05f1c318bad3476f01de695ae7a0912b4c7b582a: Status 404 returned error can't find the container with id baaedfea8b7bf3ac8fdb286e05f1c318bad3476f01de695ae7a0912b4c7b582a Apr 21 15:03:52.139201 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:52.139164 2572 generic.go:358] "Generic (PLEG): container finished" podID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerID="82f9986483407411234a044bef5aa8006c28f39e747b0bb228459042b63e25ca" exitCode=0 Apr 21 15:03:52.139584 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:52.139216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" event={"ID":"46b855be-2d57-4c71-b23f-784ee49b57fe","Type":"ContainerDied","Data":"82f9986483407411234a044bef5aa8006c28f39e747b0bb228459042b63e25ca"} Apr 21 15:03:52.139584 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:52.139238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" event={"ID":"46b855be-2d57-4c71-b23f-784ee49b57fe","Type":"ContainerStarted","Data":"baaedfea8b7bf3ac8fdb286e05f1c318bad3476f01de695ae7a0912b4c7b582a"} Apr 21 15:03:53.143956 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:53.143918 2572 generic.go:358] "Generic (PLEG): container finished" podID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerID="7913f9025bfce1de52bd8bbf3ff5aef804543539566bb7f7ff837ae1b8d7783e" exitCode=0 Apr 21 15:03:53.144272 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:53.144014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" event={"ID":"46b855be-2d57-4c71-b23f-784ee49b57fe","Type":"ContainerDied","Data":"7913f9025bfce1de52bd8bbf3ff5aef804543539566bb7f7ff837ae1b8d7783e"} Apr 21 15:03:54.149302 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:54.149268 2572 generic.go:358] "Generic (PLEG): container finished" podID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerID="e349ec11ec8f204903d1d85ec32187490d14dde7ad59dd0c4d974ca0bf45853b" exitCode=0 Apr 21 15:03:54.149685 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:54.149353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" event={"ID":"46b855be-2d57-4c71-b23f-784ee49b57fe","Type":"ContainerDied","Data":"e349ec11ec8f204903d1d85ec32187490d14dde7ad59dd0c4d974ca0bf45853b"} Apr 21 15:03:55.271636 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.271606 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:03:55.358429 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.358393 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-bundle\") pod \"46b855be-2d57-4c71-b23f-784ee49b57fe\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " Apr 21 15:03:55.358613 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.358471 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-util\") pod \"46b855be-2d57-4c71-b23f-784ee49b57fe\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " Apr 21 15:03:55.358613 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.358495 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27j7w\" (UniqueName: \"kubernetes.io/projected/46b855be-2d57-4c71-b23f-784ee49b57fe-kube-api-access-27j7w\") pod \"46b855be-2d57-4c71-b23f-784ee49b57fe\" (UID: \"46b855be-2d57-4c71-b23f-784ee49b57fe\") " Apr 21 15:03:55.359623 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.359581 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-bundle" (OuterVolumeSpecName: "bundle") pod "46b855be-2d57-4c71-b23f-784ee49b57fe" (UID: "46b855be-2d57-4c71-b23f-784ee49b57fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:55.360566 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.360540 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b855be-2d57-4c71-b23f-784ee49b57fe-kube-api-access-27j7w" (OuterVolumeSpecName: "kube-api-access-27j7w") pod "46b855be-2d57-4c71-b23f-784ee49b57fe" (UID: "46b855be-2d57-4c71-b23f-784ee49b57fe"). InnerVolumeSpecName "kube-api-access-27j7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:03:55.367126 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.367093 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-util" (OuterVolumeSpecName: "util") pod "46b855be-2d57-4c71-b23f-784ee49b57fe" (UID: "46b855be-2d57-4c71-b23f-784ee49b57fe"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:03:55.459084 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.459052 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:55.459084 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.459081 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27j7w\" (UniqueName: \"kubernetes.io/projected/46b855be-2d57-4c71-b23f-784ee49b57fe-kube-api-access-27j7w\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:55.459265 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:55.459092 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46b855be-2d57-4c71-b23f-784ee49b57fe-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:03:56.158068 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:56.157981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" event={"ID":"46b855be-2d57-4c71-b23f-784ee49b57fe","Type":"ContainerDied","Data":"baaedfea8b7bf3ac8fdb286e05f1c318bad3476f01de695ae7a0912b4c7b582a"} Apr 21 15:03:56.158068 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:56.158021 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baaedfea8b7bf3ac8fdb286e05f1c318bad3476f01de695ae7a0912b4c7b582a" Apr 21 15:03:56.158068 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:03:56.158048 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ffc8z" Apr 21 15:04:05.080064 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080034 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29"] Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080285 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="extract" Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080296 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="extract" Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080315 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="util" Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080320 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="util" Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080326 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="pull" Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080331 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="pull" Apr 21 15:04:05.080438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.080372 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="46b855be-2d57-4c71-b23f-784ee49b57fe" containerName="extract" Apr 21 
15:04:05.082978 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.082956 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.085920 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.085881 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 15:04:05.086047 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.085977 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 15:04:05.086105 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.086069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 15:04:05.086225 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.086200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-r5kgk\"" Apr 21 15:04:05.099578 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.099552 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29"] Apr 21 15:04:05.222788 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.222752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.222988 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.222797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/56cc276b-18b2-4234-94ab-4cc8994155c4-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.222988 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.222861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.222988 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.222932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.222988 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.222949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" 
(UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.223131 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.223005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.223131 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.223065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6g4\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-kube-api-access-dv6g4\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.223131 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.223091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.223220 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.223135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.300690 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.300659 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj"] Apr 21 15:04:05.302725 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.302710 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.316257 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.316228 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj"] Apr 21 15:04:05.323658 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323769 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323769 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323888 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6g4\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-kube-api-access-dv6g4\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323888 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323888 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323888 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-socket\") pod 
\"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.323888 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/56cc276b-18b2-4234-94ab-4cc8994155c4-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.324178 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.323928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.324178 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.324055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.324178 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.324169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.324462 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.324439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.324610 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.324590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/56cc276b-18b2-4234-94ab-4cc8994155c4-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.324679 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.324665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" 
Apr 21 15:04:05.326291 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.326265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.326384 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.326365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.332298 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.332241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.332639 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.332618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6g4\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-kube-api-access-dv6g4\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb77bnn29\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.393261 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.393230 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:05.425156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6838d6f4-b876-4811-bde1-5917c457710a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425304 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6838d6f4-b876-4811-bde1-5917c457710a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425304 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkfp\" (UniqueName: \"kubernetes.io/projected/6838d6f4-b876-4811-bde1-5917c457710a-kube-api-access-fmkfp\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425304 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425304 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425454 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6838d6f4-b876-4811-bde1-5917c457710a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425454 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: 
\"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425454 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.425565 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.425478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.517398 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.517369 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29"] Apr 21 15:04:05.519980 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:05.519954 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cc276b_18b2_4234_94ab_4cc8994155c4.slice/crio-410234b22e1d78909ce2b3fa49deb6a8e9793fe6ffaea8079a8f9bf8f9c261d3 WatchSource:0}: Error finding container 410234b22e1d78909ce2b3fa49deb6a8e9793fe6ffaea8079a8f9bf8f9c261d3: Status 404 returned error can't find the container with id 410234b22e1d78909ce2b3fa49deb6a8e9793fe6ffaea8079a8f9bf8f9c261d3 Apr 21 15:04:05.526255 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6838d6f4-b876-4811-bde1-5917c457710a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" 
(UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6838d6f4-b876-4811-bde1-5917c457710a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526462 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6838d6f4-b876-4811-bde1-5917c457710a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkfp\" (UniqueName: \"kubernetes.io/projected/6838d6f4-b876-4811-bde1-5917c457710a-kube-api-access-fmkfp\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526807 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526807 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" 
(UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526807 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.526975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.526938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.527404 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.527384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6838d6f4-b876-4811-bde1-5917c457710a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.528461 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.528438 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6838d6f4-b876-4811-bde1-5917c457710a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.528840 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.528823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6838d6f4-b876-4811-bde1-5917c457710a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.534364 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.534336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6838d6f4-b876-4811-bde1-5917c457710a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.534576 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.534561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkfp\" (UniqueName: \"kubernetes.io/projected/6838d6f4-b876-4811-bde1-5917c457710a-kube-api-access-fmkfp\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fnz4fj\" (UID: \"6838d6f4-b876-4811-bde1-5917c457710a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.611717 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.611635 
2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:05.731378 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:05.731354 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj"] Apr 21 15:04:05.733731 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:05.733707 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6838d6f4_b876_4811_bde1_5917c457710a.slice/crio-b800abecb4a052a78df3054091b4a2c58829b2c9c46103fda71ceb9f1819a619 WatchSource:0}: Error finding container b800abecb4a052a78df3054091b4a2c58829b2c9c46103fda71ceb9f1819a619: Status 404 returned error can't find the container with id b800abecb4a052a78df3054091b4a2c58829b2c9c46103fda71ceb9f1819a619 Apr 21 15:04:06.191493 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:06.191454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" event={"ID":"56cc276b-18b2-4234-94ab-4cc8994155c4","Type":"ContainerStarted","Data":"410234b22e1d78909ce2b3fa49deb6a8e9793fe6ffaea8079a8f9bf8f9c261d3"} Apr 21 15:04:06.192514 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:06.192490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" event={"ID":"6838d6f4-b876-4811-bde1-5917c457710a","Type":"ContainerStarted","Data":"b800abecb4a052a78df3054091b4a2c58829b2c9c46103fda71ceb9f1819a619"} Apr 21 15:04:08.230720 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:08.230680 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:04:08.231044 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:08.230766 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:04:08.231044 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:08.230809 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:04:08.237837 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:08.237362 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:04:08.237837 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:08.237445 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:04:08.237837 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:08.237486 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:04:09.206285 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.206246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" 
event={"ID":"56cc276b-18b2-4234-94ab-4cc8994155c4","Type":"ContainerStarted","Data":"1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72"} Apr 21 15:04:09.207571 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.207544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" event={"ID":"6838d6f4-b876-4811-bde1-5917c457710a","Type":"ContainerStarted","Data":"b94f699e652dfb50151d26588a48a1fb38b518a65e6ec01650baf62d518d24ac"} Apr 21 15:04:09.228570 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.228532 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" podStartSLOduration=1.5197027410000001 podStartE2EDuration="4.22852089s" podCreationTimestamp="2026-04-21 15:04:05 +0000 UTC" firstStartedPulling="2026-04-21 15:04:05.521627108 +0000 UTC m=+488.375502551" lastFinishedPulling="2026-04-21 15:04:08.23044526 +0000 UTC m=+491.084320700" observedRunningTime="2026-04-21 15:04:09.227159886 +0000 UTC m=+492.081035348" watchObservedRunningTime="2026-04-21 15:04:09.22852089 +0000 UTC m=+492.082396351" Apr 21 15:04:09.245948 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.245885 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" podStartSLOduration=1.744286357 podStartE2EDuration="4.24587256s" podCreationTimestamp="2026-04-21 15:04:05 +0000 UTC" firstStartedPulling="2026-04-21 15:04:05.735557483 +0000 UTC m=+488.589432923" lastFinishedPulling="2026-04-21 15:04:08.237143683 +0000 UTC m=+491.091019126" observedRunningTime="2026-04-21 15:04:09.245052975 +0000 UTC m=+492.098928438" watchObservedRunningTime="2026-04-21 15:04:09.24587256 +0000 UTC m=+492.099748035" Apr 21 15:04:09.394162 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.394134 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:09.395236 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.395213 2572 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29 container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.20:15021/healthz/ready\": dial tcp 10.134.0.20:15021: connect: connection refused" start-of-body= Apr 21 15:04:09.395335 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.395258 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.20:15021/healthz/ready\": dial tcp 10.134.0.20:15021: connect: connection refused" Apr 21 15:04:09.612390 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.612331 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:09.617154 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:09.617130 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:10.212525 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:10.212496 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:10.213666 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:10.213643 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fnz4fj" Apr 21 15:04:10.276431 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:10.276404 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29"] Apr 21 15:04:10.393747 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:10.393719 2572 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29 container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.20:15021/healthz/ready\": dial tcp 10.134.0.20:15021: connect: connection refused" start-of-body= Apr 21 15:04:10.393874 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:10.393784 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.20:15021/healthz/ready\": dial tcp 10.134.0.20:15021: connect: connection refused" Apr 21 15:04:11.394009 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:11.393965 2572 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29 container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.20:15021/healthz/ready\": dial tcp 10.134.0.20:15021: connect: connection refused" start-of-body= Apr 21 15:04:11.394546 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:11.394051 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.20:15021/healthz/ready\": dial tcp 10.134.0.20:15021: connect: connection refused" Apr 21 15:04:12.220091 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:12.220051 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" containerID="cri-o://1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72" gracePeriod=30 Apr 21 15:04:17.443050 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.443028 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:17.517660 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517588 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv6g4\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-kube-api-access-dv6g4\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517660 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517638 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-certs\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517660 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517660 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-credential-socket\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517943 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517681 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/56cc276b-18b2-4234-94ab-4cc8994155c4-istiod-ca-cert\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517943 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517707 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-socket\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517943 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517751 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-data\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517943 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517787 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-podinfo\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517943 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517821 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-envoy\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.517943 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.517842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-token\") pod \"56cc276b-18b2-4234-94ab-4cc8994155c4\" (UID: \"56cc276b-18b2-4234-94ab-4cc8994155c4\") " Apr 21 15:04:17.518240 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518005 2572 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:17.518240 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518022 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:17.518240 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518045 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:17.518240 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518133 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56cc276b-18b2-4234-94ab-4cc8994155c4-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:04:17.518400 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518265 2572 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-certs\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.518400 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518286 2572 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-credential-socket\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.518400 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518303 2572 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-workload-socket\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.518400 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.518263 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-data" (OuterVolumeSpecName: "istio-data") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:17.519882 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.519851 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-kube-api-access-dv6g4" (OuterVolumeSpecName: "kube-api-access-dv6g4") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). 
InnerVolumeSpecName "kube-api-access-dv6g4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:17.520114 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.520094 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 21 15:04:17.520364 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.520348 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:17.520425 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.520369 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-token" (OuterVolumeSpecName: "istio-token") pod "56cc276b-18b2-4234-94ab-4cc8994155c4" (UID: "56cc276b-18b2-4234-94ab-4cc8994155c4"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:17.619161 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.619130 2572 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-data\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.619161 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.619155 2572 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-podinfo\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.619161 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.619165 2572 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-envoy\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.619366 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.619173 2572 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-istio-token\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.619366 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.619181 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dv6g4\" (UniqueName: \"kubernetes.io/projected/56cc276b-18b2-4234-94ab-4cc8994155c4-kube-api-access-dv6g4\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:17.619366 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:17.619190 2572 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/56cc276b-18b2-4234-94ab-4cc8994155c4-istiod-ca-cert\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:18.239634 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.239598 2572 generic.go:358] "Generic (PLEG): container finished" podID="56cc276b-18b2-4234-94ab-4cc8994155c4" 
containerID="1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72" exitCode=0 Apr 21 15:04:18.239827 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.239661 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" Apr 21 15:04:18.239827 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.239667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" event={"ID":"56cc276b-18b2-4234-94ab-4cc8994155c4","Type":"ContainerDied","Data":"1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72"} Apr 21 15:04:18.239827 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.239693 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29" event={"ID":"56cc276b-18b2-4234-94ab-4cc8994155c4","Type":"ContainerDied","Data":"410234b22e1d78909ce2b3fa49deb6a8e9793fe6ffaea8079a8f9bf8f9c261d3"} Apr 21 15:04:18.239827 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.239711 2572 scope.go:117] "RemoveContainer" containerID="1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72" Apr 21 15:04:18.247643 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.247627 2572 scope.go:117] "RemoveContainer" containerID="1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72" Apr 21 15:04:18.247945 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:04:18.247921 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72\": container with ID starting with 1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72 not found: ID does not exist" containerID="1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72" Apr 21 15:04:18.248029 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.247955 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72"} err="failed to get container status \"1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72\": rpc error: code = NotFound desc = could not find container \"1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72\": container with ID starting with 1d085c6a25c71588e5710a4af98549a6022b2725f5e1e8503c0d6d3c9f14ed72 not found: ID does not exist" Apr 21 15:04:18.259885 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.259860 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29"] Apr 21 15:04:18.264394 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:18.264373 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb77bnn29"] Apr 21 15:04:19.721432 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:19.721392 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" path="/var/lib/kubelet/pods/56cc276b-18b2-4234-94ab-4cc8994155c4/volumes" Apr 21 15:04:37.450621 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.450582 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x88hn"] Apr 21 15:04:37.451022 ip-10-0-134-40 kubenswrapper[2572]: I0421 
15:04:37.450834 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" Apr 21 15:04:37.451022 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.450845 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" Apr 21 15:04:37.451022 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.450892 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="56cc276b-18b2-4234-94ab-4cc8994155c4" containerName="istio-proxy" Apr 21 15:04:37.453935 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.453900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:37.456818 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.456793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-4fvv8\"" Apr 21 15:04:37.456975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.456855 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 15:04:37.456975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.456929 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 15:04:37.466277 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.466246 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x88hn"] Apr 21 15:04:37.559889 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.559850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ndg\" (UniqueName: \"kubernetes.io/projected/efb1cfba-b628-498e-931d-0f7e4c053bca-kube-api-access-z5ndg\") pod \"kuadrant-operator-catalog-x88hn\" (UID: \"efb1cfba-b628-498e-931d-0f7e4c053bca\") " pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:37.661210 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.661173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ndg\" (UniqueName: \"kubernetes.io/projected/efb1cfba-b628-498e-931d-0f7e4c053bca-kube-api-access-z5ndg\") pod \"kuadrant-operator-catalog-x88hn\" (UID: \"efb1cfba-b628-498e-931d-0f7e4c053bca\") " pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:37.673120 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.673091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ndg\" (UniqueName: \"kubernetes.io/projected/efb1cfba-b628-498e-931d-0f7e4c053bca-kube-api-access-z5ndg\") pod \"kuadrant-operator-catalog-x88hn\" (UID: \"efb1cfba-b628-498e-931d-0f7e4c053bca\") " pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:37.763518 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.763486 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:37.961404 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:37.961374 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x88hn"] Apr 21 15:04:37.962438 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:37.962410 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb1cfba_b628_498e_931d_0f7e4c053bca.slice/crio-c2bcfaa3daecfb4292fdaa5220f61d29339edef87c78bf8673a15914e10ec294 WatchSource:0}: Error finding container c2bcfaa3daecfb4292fdaa5220f61d29339edef87c78bf8673a15914e10ec294: Status 404 returned error can't find the container with id c2bcfaa3daecfb4292fdaa5220f61d29339edef87c78bf8673a15914e10ec294 Apr 21 15:04:38.104707 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.104622 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x88hn"] Apr 21 15:04:38.154633 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.154598 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-tkdp8"] Apr 21 15:04:38.157248 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.157232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:38.179270 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.179221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-tkdp8"] Apr 21 15:04:38.266059 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.266023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4tv5\" (UniqueName: \"kubernetes.io/projected/3937c90b-44ff-41e7-8184-abae38a17533-kube-api-access-d4tv5\") pod \"kuadrant-operator-catalog-tkdp8\" (UID: \"3937c90b-44ff-41e7-8184-abae38a17533\") " pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:38.306302 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.306267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" event={"ID":"efb1cfba-b628-498e-931d-0f7e4c053bca","Type":"ContainerStarted","Data":"c2bcfaa3daecfb4292fdaa5220f61d29339edef87c78bf8673a15914e10ec294"} Apr 21 15:04:38.366478 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.366389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4tv5\" (UniqueName: \"kubernetes.io/projected/3937c90b-44ff-41e7-8184-abae38a17533-kube-api-access-d4tv5\") pod \"kuadrant-operator-catalog-tkdp8\" (UID: \"3937c90b-44ff-41e7-8184-abae38a17533\") " pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:38.376036 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.376009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4tv5\" (UniqueName: \"kubernetes.io/projected/3937c90b-44ff-41e7-8184-abae38a17533-kube-api-access-d4tv5\") pod \"kuadrant-operator-catalog-tkdp8\" (UID: \"3937c90b-44ff-41e7-8184-abae38a17533\") " pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:38.466429 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.466384 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:38.592850 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:38.592822 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-tkdp8"] Apr 21 15:04:38.595060 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:38.595026 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3937c90b_44ff_41e7_8184_abae38a17533.slice/crio-c6b047bfdf9857586c2b3f7900364beb817e4b94677a52042f1fa40896eedb05 WatchSource:0}: Error finding container c6b047bfdf9857586c2b3f7900364beb817e4b94677a52042f1fa40896eedb05: Status 404 returned error can't find the container with id c6b047bfdf9857586c2b3f7900364beb817e4b94677a52042f1fa40896eedb05 Apr 21 15:04:39.312363 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:39.312316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" event={"ID":"3937c90b-44ff-41e7-8184-abae38a17533","Type":"ContainerStarted","Data":"c6b047bfdf9857586c2b3f7900364beb817e4b94677a52042f1fa40896eedb05"} Apr 21 15:04:41.320113 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.320064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" event={"ID":"3937c90b-44ff-41e7-8184-abae38a17533","Type":"ContainerStarted","Data":"d42ad9e41e513e339b5e1c9353eda94647d0a82dfdc3b6cf2dad2a5b9f57bd51"} Apr 21 15:04:41.321337 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.321313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" event={"ID":"efb1cfba-b628-498e-931d-0f7e4c053bca","Type":"ContainerStarted","Data":"b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8"} Apr 21 15:04:41.321436 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.321385 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" podUID="efb1cfba-b628-498e-931d-0f7e4c053bca" containerName="registry-server" containerID="cri-o://b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8" gracePeriod=2 Apr 21 15:04:41.336520 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.336477 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" podStartSLOduration=1.335103985 podStartE2EDuration="3.336459241s" podCreationTimestamp="2026-04-21 15:04:38 +0000 UTC" firstStartedPulling="2026-04-21 15:04:38.596479458 +0000 UTC m=+521.450354898" lastFinishedPulling="2026-04-21 15:04:40.5978347 +0000 UTC m=+523.451710154" observedRunningTime="2026-04-21 15:04:41.335385879 +0000 UTC m=+524.189261343" watchObservedRunningTime="2026-04-21 15:04:41.336459241 +0000 UTC m=+524.190334703" Apr 21 15:04:41.351786 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.351734 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" podStartSLOduration=1.996575377 podStartE2EDuration="4.351715683s" podCreationTimestamp="2026-04-21 15:04:37 +0000 UTC" firstStartedPulling="2026-04-21 15:04:37.96361055 +0000 UTC m=+520.817485990" lastFinishedPulling="2026-04-21 15:04:40.318750856 +0000 UTC m=+523.172626296" observedRunningTime="2026-04-21 15:04:41.350358607 +0000 UTC m=+524.204234069" watchObservedRunningTime="2026-04-21 15:04:41.351715683 +0000 UTC m=+524.205591146" Apr 21 
15:04:41.550242 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.550220 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:41.593164 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.593088 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ndg\" (UniqueName: \"kubernetes.io/projected/efb1cfba-b628-498e-931d-0f7e4c053bca-kube-api-access-z5ndg\") pod \"efb1cfba-b628-498e-931d-0f7e4c053bca\" (UID: \"efb1cfba-b628-498e-931d-0f7e4c053bca\") " Apr 21 15:04:41.595204 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.595177 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb1cfba-b628-498e-931d-0f7e4c053bca-kube-api-access-z5ndg" (OuterVolumeSpecName: "kube-api-access-z5ndg") pod "efb1cfba-b628-498e-931d-0f7e4c053bca" (UID: "efb1cfba-b628-498e-931d-0f7e4c053bca"). InnerVolumeSpecName "kube-api-access-z5ndg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:41.694005 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:41.693970 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5ndg\" (UniqueName: \"kubernetes.io/projected/efb1cfba-b628-498e-931d-0f7e4c053bca-kube-api-access-z5ndg\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:42.325606 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.325572 2572 generic.go:358] "Generic (PLEG): container finished" podID="efb1cfba-b628-498e-931d-0f7e4c053bca" containerID="b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8" exitCode=0 Apr 21 15:04:42.326117 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.325630 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" Apr 21 15:04:42.326117 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.325664 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" event={"ID":"efb1cfba-b628-498e-931d-0f7e4c053bca","Type":"ContainerDied","Data":"b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8"} Apr 21 15:04:42.326117 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.325709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x88hn" event={"ID":"efb1cfba-b628-498e-931d-0f7e4c053bca","Type":"ContainerDied","Data":"c2bcfaa3daecfb4292fdaa5220f61d29339edef87c78bf8673a15914e10ec294"} Apr 21 15:04:42.326117 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.325743 2572 scope.go:117] "RemoveContainer" containerID="b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8" Apr 21 15:04:42.334470 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.334449 2572 scope.go:117] "RemoveContainer" containerID="b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8" Apr 21 15:04:42.334743 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:04:42.334724 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8\": container with ID starting with b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8 not found: ID does not exist" containerID="b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8" Apr 21 15:04:42.339106 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.334755 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8"} err="failed to get container status \"b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8\": rpc error: code = NotFound desc = could not find container \"b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8\": container with ID starting with b5c2b4cb040d2cc67d5612e353ceb83fdb84c98196a612520c82c7009b8420e8 not found: ID does not exist" Apr 21 15:04:42.342740 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.342713 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x88hn"] Apr 21 15:04:42.348774 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:42.348751 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x88hn"] Apr 21 15:04:43.721041 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:43.721008 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb1cfba-b628-498e-931d-0f7e4c053bca" path="/var/lib/kubelet/pods/efb1cfba-b628-498e-931d-0f7e4c053bca/volumes" Apr 21 15:04:48.467504 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:48.467412 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:48.467894 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:48.467706 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:48.489205 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:48.489177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:49.370194 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:49.370166 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-tkdp8" Apr 21 15:04:53.051895 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.051858 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4"] Apr 21 15:04:53.052376 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.052253 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efb1cfba-b628-498e-931d-0f7e4c053bca" containerName="registry-server" Apr 21 15:04:53.052376 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.052271 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb1cfba-b628-498e-931d-0f7e4c053bca" containerName="registry-server" Apr 21 15:04:53.052376 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.052333 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="efb1cfba-b628-498e-931d-0f7e4c053bca" containerName="registry-server" Apr 21 15:04:53.054315 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.054295 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.057194 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.057168 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdlh8\"" Apr 21 15:04:53.063006 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.062981 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4"] Apr 21 15:04:53.183712 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.183676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.183712 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.183717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb7mw\" (UniqueName: \"kubernetes.io/projected/a1298291-6cb9-4bc9-86f1-429f59568a03-kube-api-access-kb7mw\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.183950 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.183760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.284832 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.284792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.285039 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.284877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.285039 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.284929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb7mw\" (UniqueName: \"kubernetes.io/projected/a1298291-6cb9-4bc9-86f1-429f59568a03-kube-api-access-kb7mw\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.285201 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.285181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.285245 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.285208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.294342 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.294316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb7mw\" (UniqueName: \"kubernetes.io/projected/a1298291-6cb9-4bc9-86f1-429f59568a03-kube-api-access-kb7mw\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.364666 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.364579 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:53.486987 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.486960 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4"] Apr 21 15:04:53.489714 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:53.489684 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1298291_6cb9_4bc9_86f1_429f59568a03.slice/crio-5d1921ca0488c8e9612acba746cb2e74aa3765309795d06c2c7d0c3652770a79 WatchSource:0}: Error finding container 5d1921ca0488c8e9612acba746cb2e74aa3765309795d06c2c7d0c3652770a79: Status 404 returned error can't find the container with id 5d1921ca0488c8e9612acba746cb2e74aa3765309795d06c2c7d0c3652770a79 Apr 21 15:04:53.674764 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.674683 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn"] Apr 21 15:04:53.677067 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.677048 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.690098 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.690057 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn"] Apr 21 15:04:53.788317 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.788279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.788498 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.788392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflzq\" (UniqueName: \"kubernetes.io/projected/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-kube-api-access-fflzq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.788498 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.788448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.889787 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.889748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.889787 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.889787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.890038 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.889833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fflzq\" (UniqueName: \"kubernetes.io/projected/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-kube-api-access-fflzq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.890253 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.890229 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.890288 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.890237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.898503 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.898468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflzq\" (UniqueName: \"kubernetes.io/projected/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-kube-api-access-fflzq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:53.986006 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:53.985974 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:54.108482 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.108454 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn"] Apr 21 15:04:54.110454 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:54.110423 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1291a95_eb90_40f2_b6be_eff3e3d9b3ef.slice/crio-003e9ddc24c041c5e6ba4dd8b81c7bfaafa586718cfc4eb0ab71832573e20071 WatchSource:0}: Error finding container 003e9ddc24c041c5e6ba4dd8b81c7bfaafa586718cfc4eb0ab71832573e20071: Status 404 returned error can't find the container with id 003e9ddc24c041c5e6ba4dd8b81c7bfaafa586718cfc4eb0ab71832573e20071 Apr 21 15:04:54.270593 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.270507 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm"] Apr 21 15:04:54.273079 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.273063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.282952 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.282921 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm"] Apr 21 15:04:54.366654 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.366624 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerID="e11b541ac417808e81b8d54da281c3fa63eeef4f14e59d3c292ddabaccc87c11" exitCode=0 Apr 21 15:04:54.366849 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.366710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" event={"ID":"a1298291-6cb9-4bc9-86f1-429f59568a03","Type":"ContainerDied","Data":"e11b541ac417808e81b8d54da281c3fa63eeef4f14e59d3c292ddabaccc87c11"} Apr 21 15:04:54.366849 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.366754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" event={"ID":"a1298291-6cb9-4bc9-86f1-429f59568a03","Type":"ContainerStarted","Data":"5d1921ca0488c8e9612acba746cb2e74aa3765309795d06c2c7d0c3652770a79"} Apr 21 15:04:54.368117 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.368095 2572 generic.go:358] "Generic (PLEG): container finished" podID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerID="7e7d240746b1dcc1772f324d44ff6af645cd6366b22320915d1db14172eea006" exitCode=0 Apr 21 15:04:54.368216 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.368176 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" event={"ID":"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef","Type":"ContainerDied","Data":"7e7d240746b1dcc1772f324d44ff6af645cd6366b22320915d1db14172eea006"} Apr 21 15:04:54.368216 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.368206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" 
event={"ID":"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef","Type":"ContainerStarted","Data":"003e9ddc24c041c5e6ba4dd8b81c7bfaafa586718cfc4eb0ab71832573e20071"} Apr 21 15:04:54.399444 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.395692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.399444 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.395751 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.399444 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.395857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9nt5\" (UniqueName: \"kubernetes.io/projected/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-kube-api-access-c9nt5\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.497038 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.496994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9nt5\" (UniqueName: \"kubernetes.io/projected/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-kube-api-access-c9nt5\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.497230 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.497205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.497314 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.497238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.497550 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.497532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.497603 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.497571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.506625 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.506597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9nt5\" (UniqueName: \"kubernetes.io/projected/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-kube-api-access-c9nt5\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.582963 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.582848 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:54.658571 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.657875 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm"] Apr 21 15:04:54.661272 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.661244 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.670717 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.670650 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm"] Apr 21 15:04:54.720508 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.720480 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm"] Apr 21 15:04:54.721612 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:54.721578 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4862ae9d_8ac4_44ef_b3c2_cb0e986e2e15.slice/crio-a76f6bf910501ff5b85841efc9a4e8e5af949468202888ae5dfe61dd30e9414c WatchSource:0}: Error finding container a76f6bf910501ff5b85841efc9a4e8e5af949468202888ae5dfe61dd30e9414c: Status 404 returned error can't find the container with id a76f6bf910501ff5b85841efc9a4e8e5af949468202888ae5dfe61dd30e9414c Apr 21 15:04:54.799812 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.799748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.800047 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.799837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zfk\" (UniqueName: \"kubernetes.io/projected/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-kube-api-access-h7zfk\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.800047 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.799878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.900694 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.900591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zfk\" (UniqueName: \"kubernetes.io/projected/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-kube-api-access-h7zfk\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.900694 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.900633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.900694 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.900696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.901144 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.901124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.901144 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.901134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.909835 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.909810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zfk\" (UniqueName: \"kubernetes.io/projected/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-kube-api-access-h7zfk\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:54.972118 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:54.972080 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:55.108855 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.108831 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm"] Apr 21 15:04:55.152857 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:04:55.152782 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bf2bb0_cff5_47e5_87d7_ac222bcf6281.slice/crio-2a3e02dbd8c3b7c74f3039201e2fe3055963fc44826d6f6949bcfea9cf40a695 WatchSource:0}: Error finding container 2a3e02dbd8c3b7c74f3039201e2fe3055963fc44826d6f6949bcfea9cf40a695: Status 404 returned error can't find the container with id 2a3e02dbd8c3b7c74f3039201e2fe3055963fc44826d6f6949bcfea9cf40a695 Apr 21 15:04:55.373591 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.373554 2572 generic.go:358] "Generic (PLEG): container finished" podID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerID="69f803265e34b39bef43ba2bdf6b6272462014307ec2e8dcf00c69e8fe9510fa" exitCode=0 Apr 21 15:04:55.373790 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.373621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" event={"ID":"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef","Type":"ContainerDied","Data":"69f803265e34b39bef43ba2bdf6b6272462014307ec2e8dcf00c69e8fe9510fa"} Apr 21 15:04:55.375030 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.375004 2572 generic.go:358] "Generic (PLEG): container finished" podID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerID="4b4531f65699833e2a9cdc4fc1208439c2c2b47257512d72f07481b7dfebd33d" exitCode=0 Apr 21 15:04:55.375133 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.375032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" event={"ID":"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15","Type":"ContainerDied","Data":"4b4531f65699833e2a9cdc4fc1208439c2c2b47257512d72f07481b7dfebd33d"} Apr 21 15:04:55.375133 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.375067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" event={"ID":"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15","Type":"ContainerStarted","Data":"a76f6bf910501ff5b85841efc9a4e8e5af949468202888ae5dfe61dd30e9414c"} Apr 21 15:04:55.376535 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.376511 2572 generic.go:358] "Generic (PLEG): container finished" podID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerID="e62144ec9786545d7e94febb0ea7bc76205bc375edd64412053023691f1fc7d9" exitCode=0 Apr 21 15:04:55.376604 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.376592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" event={"ID":"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281","Type":"ContainerDied","Data":"e62144ec9786545d7e94febb0ea7bc76205bc375edd64412053023691f1fc7d9"} Apr 21 15:04:55.376651 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:55.376615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" 
event={"ID":"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281","Type":"ContainerStarted","Data":"2a3e02dbd8c3b7c74f3039201e2fe3055963fc44826d6f6949bcfea9cf40a695"} Apr 21 15:04:56.382457 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:56.382389 2572 generic.go:358] "Generic (PLEG): container finished" podID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerID="2e1ef982364a7c5358818b572dd29979f4a0d0d7e35410743c5186afc4777539" exitCode=0 Apr 21 15:04:56.382849 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:56.382472 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" event={"ID":"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef","Type":"ContainerDied","Data":"2e1ef982364a7c5358818b572dd29979f4a0d0d7e35410743c5186afc4777539"} Apr 21 15:04:56.384039 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:56.384015 2572 generic.go:358] "Generic (PLEG): container finished" podID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerID="0876a379f1d4004b5e2fab0d13cfedce04fa3e26f03b8bfbace5df0dbf1fac2d" exitCode=0 Apr 21 15:04:56.384146 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:56.384093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" event={"ID":"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281","Type":"ContainerDied","Data":"0876a379f1d4004b5e2fab0d13cfedce04fa3e26f03b8bfbace5df0dbf1fac2d"} Apr 21 15:04:56.385862 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:56.385837 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerID="7e8f245fe3f062148dbbd5e5ccf0d2ff1d4f2ccd8ec4a6d7b25b708a0b491e18" exitCode=0 Apr 21 15:04:56.385980 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:56.385882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" event={"ID":"a1298291-6cb9-4bc9-86f1-429f59568a03","Type":"ContainerDied","Data":"7e8f245fe3f062148dbbd5e5ccf0d2ff1d4f2ccd8ec4a6d7b25b708a0b491e18"} Apr 21 15:04:57.391492 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.391456 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerID="bf64b72fdfae568370691117ca21fe00bc59930b5d6c14cc1c14e4df53cf8499" exitCode=0 Apr 21 15:04:57.391898 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.391535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" event={"ID":"a1298291-6cb9-4bc9-86f1-429f59568a03","Type":"ContainerDied","Data":"bf64b72fdfae568370691117ca21fe00bc59930b5d6c14cc1c14e4df53cf8499"} Apr 21 15:04:57.393107 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.393080 2572 generic.go:358] "Generic (PLEG): container finished" podID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerID="5f0a22a1e669e9c41129d1fbb5d12a6bf967e675cf8fc06ccbedadf4d7d5ba3a" exitCode=0 Apr 21 15:04:57.393223 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.393164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" event={"ID":"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15","Type":"ContainerDied","Data":"5f0a22a1e669e9c41129d1fbb5d12a6bf967e675cf8fc06ccbedadf4d7d5ba3a"} Apr 21 15:04:57.395017 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.394998 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerID="b83c3d12538a746891e703d1b9f47c08b3a059a6580e3c30588160fb2f1844b5" exitCode=0 Apr 21 15:04:57.395088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.395022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" event={"ID":"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281","Type":"ContainerDied","Data":"b83c3d12538a746891e703d1b9f47c08b3a059a6580e3c30588160fb2f1844b5"} Apr 21 15:04:57.523598 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.523575 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:57.623798 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.623712 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-util\") pod \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " Apr 21 15:04:57.623948 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.623832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-bundle\") pod \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " Apr 21 15:04:57.623948 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.623891 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflzq\" (UniqueName: \"kubernetes.io/projected/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-kube-api-access-fflzq\") pod \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\" (UID: \"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef\") " Apr 21 15:04:57.624354 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.624322 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-bundle" (OuterVolumeSpecName: "bundle") pod "c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" (UID: "c1291a95-eb90-40f2-b6be-eff3e3d9b3ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:57.625951 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.625895 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-kube-api-access-fflzq" (OuterVolumeSpecName: "kube-api-access-fflzq") pod "c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" (UID: "c1291a95-eb90-40f2-b6be-eff3e3d9b3ef"). InnerVolumeSpecName "kube-api-access-fflzq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:57.628587 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.628566 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-util" (OuterVolumeSpecName: "util") pod "c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" (UID: "c1291a95-eb90-40f2-b6be-eff3e3d9b3ef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:57.724537 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.724505 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:57.724537 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.724528 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:57.724537 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:57.724537 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fflzq\" (UniqueName: \"kubernetes.io/projected/c1291a95-eb90-40f2-b6be-eff3e3d9b3ef-kube-api-access-fflzq\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:58.400175 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.400135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" event={"ID":"c1291a95-eb90-40f2-b6be-eff3e3d9b3ef","Type":"ContainerDied","Data":"003e9ddc24c041c5e6ba4dd8b81c7bfaafa586718cfc4eb0ab71832573e20071"} Apr 21 15:04:58.400175 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.400174 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003e9ddc24c041c5e6ba4dd8b81c7bfaafa586718cfc4eb0ab71832573e20071" Apr 21 15:04:58.400628 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.400200 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn" Apr 21 15:04:58.402275 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.402191 2572 generic.go:358] "Generic (PLEG): container finished" podID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerID="1e83e3d00631fe8033544fa76b4037f4320cf4bf9cbd19817c132ab8360671e2" exitCode=0 Apr 21 15:04:58.402483 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.402461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" event={"ID":"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15","Type":"ContainerDied","Data":"1e83e3d00631fe8033544fa76b4037f4320cf4bf9cbd19817c132ab8360671e2"} Apr 21 15:04:58.550129 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.550073 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:58.552835 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.552816 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:58.632114 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632083 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-bundle\") pod \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " Apr 21 15:04:58.632114 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632122 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-bundle\") pod \"a1298291-6cb9-4bc9-86f1-429f59568a03\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " Apr 21 15:04:58.632350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632158 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7zfk\" (UniqueName: \"kubernetes.io/projected/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-kube-api-access-h7zfk\") pod \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " Apr 21 15:04:58.632350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632188 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-util\") pod \"a1298291-6cb9-4bc9-86f1-429f59568a03\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " Apr 21 15:04:58.632350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632214 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-util\") pod \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\" (UID: \"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281\") " Apr 21 15:04:58.632350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632246 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb7mw\" (UniqueName: \"kubernetes.io/projected/a1298291-6cb9-4bc9-86f1-429f59568a03-kube-api-access-kb7mw\") pod \"a1298291-6cb9-4bc9-86f1-429f59568a03\" (UID: \"a1298291-6cb9-4bc9-86f1-429f59568a03\") " Apr 21 15:04:58.632785 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632754 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-bundle" (OuterVolumeSpecName: "bundle") pod "a1298291-6cb9-4bc9-86f1-429f59568a03" (UID: "a1298291-6cb9-4bc9-86f1-429f59568a03"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:58.632944 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.632791 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-bundle" (OuterVolumeSpecName: "bundle") pod "b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" (UID: "b8bf2bb0-cff5-47e5-87d7-ac222bcf6281"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:58.634341 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.634318 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-kube-api-access-h7zfk" (OuterVolumeSpecName: "kube-api-access-h7zfk") pod "b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" (UID: "b8bf2bb0-cff5-47e5-87d7-ac222bcf6281"). 
InnerVolumeSpecName "kube-api-access-h7zfk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:58.634457 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.634343 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1298291-6cb9-4bc9-86f1-429f59568a03-kube-api-access-kb7mw" (OuterVolumeSpecName: "kube-api-access-kb7mw") pod "a1298291-6cb9-4bc9-86f1-429f59568a03" (UID: "a1298291-6cb9-4bc9-86f1-429f59568a03"). InnerVolumeSpecName "kube-api-access-kb7mw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:58.637757 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.637725 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-util" (OuterVolumeSpecName: "util") pod "b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" (UID: "b8bf2bb0-cff5-47e5-87d7-ac222bcf6281"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:58.653328 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.653307 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-util" (OuterVolumeSpecName: "util") pod "a1298291-6cb9-4bc9-86f1-429f59568a03" (UID: "a1298291-6cb9-4bc9-86f1-429f59568a03"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:58.732766 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.732730 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:58.732766 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.732759 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:58.732970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.732772 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7zfk\" (UniqueName: \"kubernetes.io/projected/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-kube-api-access-h7zfk\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:58.732970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.732786 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1298291-6cb9-4bc9-86f1-429f59568a03-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:58.732970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.732798 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8bf2bb0-cff5-47e5-87d7-ac222bcf6281-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:58.732970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:58.732809 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kb7mw\" (UniqueName: \"kubernetes.io/projected/a1298291-6cb9-4bc9-86f1-429f59568a03-kube-api-access-kb7mw\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:59.408231 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.408200 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" Apr 21 15:04:59.408231 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.408213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm" event={"ID":"b8bf2bb0-cff5-47e5-87d7-ac222bcf6281","Type":"ContainerDied","Data":"2a3e02dbd8c3b7c74f3039201e2fe3055963fc44826d6f6949bcfea9cf40a695"} Apr 21 15:04:59.408683 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.408256 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a3e02dbd8c3b7c74f3039201e2fe3055963fc44826d6f6949bcfea9cf40a695" Apr 21 15:04:59.410026 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.410005 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" Apr 21 15:04:59.410026 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.410012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4" event={"ID":"a1298291-6cb9-4bc9-86f1-429f59568a03","Type":"ContainerDied","Data":"5d1921ca0488c8e9612acba746cb2e74aa3765309795d06c2c7d0c3652770a79"} Apr 21 15:04:59.410193 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.410042 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d1921ca0488c8e9612acba746cb2e74aa3765309795d06c2c7d0c3652770a79" Apr 21 15:04:59.529390 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.529367 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:04:59.639978 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.639852 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9nt5\" (UniqueName: \"kubernetes.io/projected/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-kube-api-access-c9nt5\") pod \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " Apr 21 15:04:59.639978 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.639957 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-util\") pod \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " Apr 21 15:04:59.640224 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.640012 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-bundle\") pod \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\" (UID: \"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15\") " Apr 21 15:04:59.640458 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.640433 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-bundle" (OuterVolumeSpecName: "bundle") pod "4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" (UID: "4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:59.641952 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.641923 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-kube-api-access-c9nt5" (OuterVolumeSpecName: "kube-api-access-c9nt5") pod "4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" (UID: "4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15"). InnerVolumeSpecName "kube-api-access-c9nt5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:04:59.645129 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.645089 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-util" (OuterVolumeSpecName: "util") pod "4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" (UID: "4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:04:59.740756 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.740724 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:59.740756 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.740753 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:04:59.740946 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:04:59.740763 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9nt5\" (UniqueName: \"kubernetes.io/projected/4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15-kube-api-access-c9nt5\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:05:00.415088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:00.415052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" event={"ID":"4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15","Type":"ContainerDied","Data":"a76f6bf910501ff5b85841efc9a4e8e5af949468202888ae5dfe61dd30e9414c"} Apr 21 15:05:00.415088 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:00.415086 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a76f6bf910501ff5b85841efc9a4e8e5af949468202888ae5dfe61dd30e9414c" Apr 21 15:05:00.415503 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:00.415125 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm" Apr 21 15:05:09.407750 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.407707 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs"] Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.407989 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408002 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408014 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408020 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408028 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408033 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408038 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408043 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408048 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408053 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408059 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408063 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408068 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408072 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408080 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="util" Apr 21 
15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408084 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408089 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408094 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408103 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408107 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="pull" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408116 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="extract" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408128 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408132 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="util" Apr 21 15:05:09.408156 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408167 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1298291-6cb9-4bc9-86f1-429f59568a03" containerName="extract" Apr 21 15:05:09.408899 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408177 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1291a95-eb90-40f2-b6be-eff3e3d9b3ef" containerName="extract" Apr 21 15:05:09.408899 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408183 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15" containerName="extract" Apr 21 15:05:09.408899 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.408188 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8bf2bb0-cff5-47e5-87d7-ac222bcf6281" containerName="extract" Apr 21 15:05:09.411173 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.411155 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:09.414711 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.414683 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-wpt7k\"" Apr 21 15:05:09.420594 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.420563 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs"] Apr 21 15:05:09.513839 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.513800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkpss\" (UniqueName: \"kubernetes.io/projected/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6-kube-api-access-jkpss\") pod \"limitador-operator-controller-manager-85c4996f8c-b74hs\" (UID: \"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:09.615115 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.615078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkpss\" (UniqueName: \"kubernetes.io/projected/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6-kube-api-access-jkpss\") pod \"limitador-operator-controller-manager-85c4996f8c-b74hs\" (UID: \"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:09.631782 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.631749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkpss\" (UniqueName: \"kubernetes.io/projected/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6-kube-api-access-jkpss\") pod \"limitador-operator-controller-manager-85c4996f8c-b74hs\" (UID: \"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:09.722587 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.722560 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:09.856946 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:09.856919 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs"] Apr 21 15:05:09.859096 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:05:09.859066 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba733b0c_2f9e_43a1_9cc1_b4aebcdb10f6.slice/crio-1fe6a0ea625aec252978f2695b4ba0a18ee13ec1ca33cc2f2cbbf6d8d7514657 WatchSource:0}: Error finding container 1fe6a0ea625aec252978f2695b4ba0a18ee13ec1ca33cc2f2cbbf6d8d7514657: Status 404 returned error can't find the container with id 1fe6a0ea625aec252978f2695b4ba0a18ee13ec1ca33cc2f2cbbf6d8d7514657 Apr 21 15:05:10.452001 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:10.451964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" event={"ID":"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6","Type":"ContainerStarted","Data":"1fe6a0ea625aec252978f2695b4ba0a18ee13ec1ca33cc2f2cbbf6d8d7514657"} Apr 21 15:05:12.460219 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.460180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" event={"ID":"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6","Type":"ContainerStarted","Data":"729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0"} Apr 21 15:05:12.460680 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.460299 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:12.480535 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.480436 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" podStartSLOduration=1.123388476 podStartE2EDuration="3.480420562s" podCreationTimestamp="2026-04-21 15:05:09 +0000 UTC" firstStartedPulling="2026-04-21 15:05:09.860957232 +0000 UTC m=+552.714832672" lastFinishedPulling="2026-04-21 15:05:12.217989316 +0000 UTC m=+555.071864758" observedRunningTime="2026-04-21 15:05:12.47836622 +0000 UTC m=+555.332241703" watchObservedRunningTime="2026-04-21 15:05:12.480420562 +0000 UTC m=+555.334296024" Apr 21 15:05:12.829595 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.829503 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74"] Apr 21 15:05:12.832667 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.832649 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:12.835454 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.835432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-s4xg7\"" Apr 21 15:05:12.835573 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.835432 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 15:05:12.845188 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.845160 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74"] Apr 21 15:05:12.940151 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:12.940119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9wl\" (UniqueName: \"kubernetes.io/projected/e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7-kube-api-access-2p9wl\") pod \"dns-operator-controller-manager-648d5c98bc-dbc74\" (UID: \"e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:13.040931 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:13.040869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9wl\" (UniqueName: \"kubernetes.io/projected/e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7-kube-api-access-2p9wl\") pod \"dns-operator-controller-manager-648d5c98bc-dbc74\" (UID: \"e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:13.050367 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:13.050332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9wl\" (UniqueName: \"kubernetes.io/projected/e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7-kube-api-access-2p9wl\") pod \"dns-operator-controller-manager-648d5c98bc-dbc74\" (UID: \"e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:13.142282 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:13.142179 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:13.305497 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:13.305475 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74"] Apr 21 15:05:13.307730 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:05:13.307696 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode147e374_9b6a_4c0c_8db9_fbe6ea3f1ea7.slice/crio-92d683ec76f336864509842a75078c89cf6b1c7ad51aefa0fa2fddebbdb5362f WatchSource:0}: Error finding container 92d683ec76f336864509842a75078c89cf6b1c7ad51aefa0fa2fddebbdb5362f: Status 404 returned error can't find the container with id 92d683ec76f336864509842a75078c89cf6b1c7ad51aefa0fa2fddebbdb5362f Apr 21 15:05:13.464948 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:13.464887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" event={"ID":"e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7","Type":"ContainerStarted","Data":"92d683ec76f336864509842a75078c89cf6b1c7ad51aefa0fa2fddebbdb5362f"} Apr 21 15:05:17.484822 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:17.484787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" event={"ID":"e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7","Type":"ContainerStarted","Data":"8e37a5c7338a23aebe3df1d470cae2ea5ae04280675d529bd99f9f2fb5d1fadb"} Apr 21 15:05:17.485250 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:17.484840 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:17.507498 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:17.507442 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" podStartSLOduration=2.071558735 podStartE2EDuration="5.507425931s" podCreationTimestamp="2026-04-21 15:05:12 +0000 UTC" firstStartedPulling="2026-04-21 15:05:13.310007055 +0000 UTC m=+556.163882494" lastFinishedPulling="2026-04-21 15:05:16.745874247 +0000 UTC m=+559.599749690" observedRunningTime="2026-04-21 15:05:17.505425231 +0000 UTC m=+560.359300693" watchObservedRunningTime="2026-04-21 15:05:17.507425931 +0000 UTC m=+560.361301393" Apr 21 15:05:21.673309 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.673267 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-9q69c"] Apr 21 15:05:21.676527 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.676510 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:21.680300 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.680279 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-8bw9g\"" Apr 21 15:05:21.691871 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.691848 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-9q69c"] Apr 21 15:05:21.812425 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.812380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwl5\" (UniqueName: \"kubernetes.io/projected/b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6-kube-api-access-rnwl5\") pod \"authorino-operator-657f44b778-9q69c\" (UID: \"b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6\") " pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:21.913676 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.913640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwl5\" (UniqueName: \"kubernetes.io/projected/b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6-kube-api-access-rnwl5\") pod \"authorino-operator-657f44b778-9q69c\" (UID: \"b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6\") " pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:21.925404 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.925339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwl5\" (UniqueName: \"kubernetes.io/projected/b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6-kube-api-access-rnwl5\") pod \"authorino-operator-657f44b778-9q69c\" (UID: \"b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6\") " pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:21.985940 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:21.985889 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:22.109850 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:22.109825 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-9q69c"] Apr 21 15:05:22.111883 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:05:22.111856 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2c0319d_728e_4fe5_ac74_4bb6f0c1f9c6.slice/crio-4cd5b34d31a615ffbc6d2797ab589e5cbabba2412e0a656a41f23f2e5d8ffd1e WatchSource:0}: Error finding container 4cd5b34d31a615ffbc6d2797ab589e5cbabba2412e0a656a41f23f2e5d8ffd1e: Status 404 returned error can't find the container with id 4cd5b34d31a615ffbc6d2797ab589e5cbabba2412e0a656a41f23f2e5d8ffd1e Apr 21 15:05:22.504133 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:22.504098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" event={"ID":"b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6","Type":"ContainerStarted","Data":"4cd5b34d31a615ffbc6d2797ab589e5cbabba2412e0a656a41f23f2e5d8ffd1e"} Apr 21 15:05:23.468177 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:23.468147 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:24.513212 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:24.513169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" event={"ID":"b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6","Type":"ContainerStarted","Data":"7c479da2ea310e49763b692858895c2cd1eb5eb1f8719cf22e053221843e600b"} Apr 21 15:05:24.513570 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:24.513282 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:24.532034 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:24.531983 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" podStartSLOduration=1.234503773 podStartE2EDuration="3.531967687s" podCreationTimestamp="2026-04-21 15:05:21 +0000 UTC" firstStartedPulling="2026-04-21 15:05:22.113990649 +0000 UTC m=+564.967866092" lastFinishedPulling="2026-04-21 15:05:24.411454563 +0000 UTC m=+567.265330006" observedRunningTime="2026-04-21 15:05:24.530199933 +0000 UTC m=+567.384075395" watchObservedRunningTime="2026-04-21 15:05:24.531967687 +0000 UTC m=+567.385843148" Apr 21 15:05:27.482605 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.482567 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn"] Apr 21 15:05:27.487035 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.487010 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.489665 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.489643 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 15:05:27.489790 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.489706 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdlh8\"" Apr 21 15:05:27.489919 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.489893 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 15:05:27.494032 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.493954 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn"] Apr 21 15:05:27.664813 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.664775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44dmq\" (UniqueName: \"kubernetes.io/projected/fc9b2af3-d5a7-4f3d-829c-842f23393991-kube-api-access-44dmq\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.664813 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.664821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b2af3-d5a7-4f3d-829c-842f23393991-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.665044 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.664924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc9b2af3-d5a7-4f3d-829c-842f23393991-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.765698 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.765625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44dmq\" (UniqueName: \"kubernetes.io/projected/fc9b2af3-d5a7-4f3d-829c-842f23393991-kube-api-access-44dmq\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.765698 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.765661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b2af3-d5a7-4f3d-829c-842f23393991-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.765964 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.765942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc9b2af3-d5a7-4f3d-829c-842f23393991-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") 
" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.766532 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.766510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fc9b2af3-d5a7-4f3d-829c-842f23393991-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.768046 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.768030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b2af3-d5a7-4f3d-829c-842f23393991-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.775110 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.775089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44dmq\" (UniqueName: \"kubernetes.io/projected/fc9b2af3-d5a7-4f3d-829c-842f23393991-kube-api-access-44dmq\") pod \"kuadrant-console-plugin-6cb54b5c86-9nhbn\" (UID: \"fc9b2af3-d5a7-4f3d-829c-842f23393991\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.796935 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.796898 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" Apr 21 15:05:27.919060 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:27.919033 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn"] Apr 21 15:05:27.921069 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:05:27.921037 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc9b2af3_d5a7_4f3d_829c_842f23393991.slice/crio-9bc89a370c0d3a5edfa379e36e94cf665ef266a6aae8efa33f3df32acba638c4 WatchSource:0}: Error finding container 9bc89a370c0d3a5edfa379e36e94cf665ef266a6aae8efa33f3df32acba638c4: Status 404 returned error can't find the container with id 9bc89a370c0d3a5edfa379e36e94cf665ef266a6aae8efa33f3df32acba638c4 Apr 21 15:05:28.492809 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:28.492780 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-dbc74" Apr 21 15:05:28.527548 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:28.527508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" event={"ID":"fc9b2af3-d5a7-4f3d-829c-842f23393991","Type":"ContainerStarted","Data":"9bc89a370c0d3a5edfa379e36e94cf665ef266a6aae8efa33f3df32acba638c4"} Apr 21 15:05:35.518817 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:35.518790 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-9q69c" Apr 21 15:05:38.291666 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.291636 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs"] Apr 21 15:05:38.292097 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.291879 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" containerName="manager" containerID="cri-o://729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0" gracePeriod=2 Apr 21 15:05:38.304509 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.304453 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs"] Apr 21 15:05:38.320804 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.320777 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7"] Apr 21 15:05:38.321393 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.321171 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" containerName="manager" Apr 21 15:05:38.321393 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.321189 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" containerName="manager" Apr 21 15:05:38.321393 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.321302 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" containerName="manager" Apr 21 15:05:38.324524 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.324507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:38.336390 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.336350 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7"] Apr 21 15:05:38.347797 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.347763 2572 status_manager.go:895] "Failed to get status for pod" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" err="pods \"limitador-operator-controller-manager-85c4996f8c-b74hs\" is forbidden: User \"system:node:ip-10-0-134-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-40.ec2.internal' and this object" Apr 21 15:05:38.353017 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.352991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vgl\" (UniqueName: \"kubernetes.io/projected/1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae-kube-api-access-x9vgl\") pod \"limitador-operator-controller-manager-85c4996f8c-8pph7\" (UID: \"1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:38.454438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.454398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vgl\" (UniqueName: \"kubernetes.io/projected/1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae-kube-api-access-x9vgl\") pod \"limitador-operator-controller-manager-85c4996f8c-8pph7\" (UID: \"1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:38.463000 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.462960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vgl\" (UniqueName: 
\"kubernetes.io/projected/1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae-kube-api-access-x9vgl\") pod \"limitador-operator-controller-manager-85c4996f8c-8pph7\" (UID: \"1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:38.524711 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.524686 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:38.529263 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.529235 2572 status_manager.go:895] "Failed to get status for pod" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" err="pods \"limitador-operator-controller-manager-85c4996f8c-b74hs\" is forbidden: User \"system:node:ip-10-0-134-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-40.ec2.internal' and this object" Apr 21 15:05:38.555742 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.555682 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkpss\" (UniqueName: \"kubernetes.io/projected/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6-kube-api-access-jkpss\") pod \"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6\" (UID: \"ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6\") " Apr 21 15:05:38.557878 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.557848 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6-kube-api-access-jkpss" (OuterVolumeSpecName: "kube-api-access-jkpss") pod "ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" (UID: "ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6"). InnerVolumeSpecName "kube-api-access-jkpss". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:05:38.565422 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.565395 2572 generic.go:358] "Generic (PLEG): container finished" podID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" containerID="729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0" exitCode=0 Apr 21 15:05:38.565510 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.565461 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" Apr 21 15:05:38.565510 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.565473 2572 scope.go:117] "RemoveContainer" containerID="729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0" Apr 21 15:05:38.568146 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.568112 2572 status_manager.go:895] "Failed to get status for pod" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" err="pods \"limitador-operator-controller-manager-85c4996f8c-b74hs\" is forbidden: User \"system:node:ip-10-0-134-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-40.ec2.internal' and this object" Apr 21 15:05:38.576354 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.576043 2572 scope.go:117] "RemoveContainer" containerID="729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0" Apr 21 15:05:38.576551 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:05:38.576525 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0\": container with ID starting with 729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0 not found: ID does not exist" containerID="729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0" Apr 21 15:05:38.576622 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.576563 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0"} err="failed to get container status \"729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0\": rpc error: code = NotFound desc = could not find container \"729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0\": container with ID starting with 729599c1db6a6dabfcb3694a7177c223e08328b697225c4490eed3ed82eedca0 not found: ID does not exist" Apr 21 15:05:38.579750 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.579721 2572 status_manager.go:895] "Failed to get status for pod" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" err="pods \"limitador-operator-controller-manager-85c4996f8c-b74hs\" is forbidden: User \"system:node:ip-10-0-134-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-40.ec2.internal' and this object" Apr 21 15:05:38.657015 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.656984 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkpss\" (UniqueName: \"kubernetes.io/projected/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6-kube-api-access-jkpss\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:05:38.662817 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.662792 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:38.788554 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:38.788502 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7"] Apr 21 15:05:38.791210 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:05:38.791184 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a5a9844_0a16_41c9_9ff1_9c2a8bc5ebae.slice/crio-e748379acd564946a2d96f30236c975558f0c35a666d13382dc58a4d7d1ef942 WatchSource:0}: Error finding container e748379acd564946a2d96f30236c975558f0c35a666d13382dc58a4d7d1ef942: Status 404 returned error can't find the container with id e748379acd564946a2d96f30236c975558f0c35a666d13382dc58a4d7d1ef942 Apr 21 15:05:39.572187 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:39.572150 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" event={"ID":"1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae","Type":"ContainerStarted","Data":"029c51316e856ac48efeac9c898ae08ddce6d1b363b32cd37bb024b39211a448"} Apr 21 15:05:39.572187 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:39.572191 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" event={"ID":"1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae","Type":"ContainerStarted","Data":"e748379acd564946a2d96f30236c975558f0c35a666d13382dc58a4d7d1ef942"} Apr 21 15:05:39.572685 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:39.572292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:39.615598 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:39.615563 2572 status_manager.go:895] "Failed to get status for pod" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-b74hs" err="pods \"limitador-operator-controller-manager-85c4996f8c-b74hs\" is forbidden: User \"system:node:ip-10-0-134-40.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-40.ec2.internal' and this object" Apr 21 15:05:39.722242 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:39.722199 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6" path="/var/lib/kubelet/pods/ba733b0c-2f9e-43a1-9cc1-b4aebcdb10f6/volumes" Apr 21 15:05:50.578859 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:50.578819 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" Apr 21 15:05:50.609826 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:50.609757 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8pph7" podStartSLOduration=12.609735294 podStartE2EDuration="12.609735294s" podCreationTimestamp="2026-04-21 15:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:05:39.612524476 +0000 UTC m=+582.466399937" watchObservedRunningTime="2026-04-21 15:05:50.609735294 +0000 UTC m=+593.463610758" Apr 21 15:05:55.635397 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:55.635360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" event={"ID":"fc9b2af3-d5a7-4f3d-829c-842f23393991","Type":"ContainerStarted","Data":"8253b56e0116319c0bc34196177ae11ea6629960cea8485c12b23ff4b10381d8"} Apr 21 15:05:55.652305 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:55.652255 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-9nhbn" podStartSLOduration=1.58923145 podStartE2EDuration="28.652239703s" podCreationTimestamp="2026-04-21 15:05:27 +0000 UTC" firstStartedPulling="2026-04-21 15:05:27.922348864 +0000 UTC m=+570.776224305" lastFinishedPulling="2026-04-21 15:05:54.985357115 +0000 UTC m=+597.839232558" observedRunningTime="2026-04-21 15:05:55.649803453 +0000 UTC m=+598.503678914" watchObservedRunningTime="2026-04-21 15:05:55.652239703 +0000 UTC m=+598.506115166" Apr 21 15:05:57.628570 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:57.628540 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:05:57.629031 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:05:57.628593 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:06:22.155019 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.154939 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wx72l"] Apr 21 15:06:22.209401 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.209360 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wx72l"] Apr 21 15:06:22.209618 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.209485 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:22.212425 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.212402 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wqcsw\"" Apr 21 15:06:22.320416 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.320369 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhzln\" (UniqueName: \"kubernetes.io/projected/4a1006b9-b5d7-474d-a97a-8d268929ef3d-kube-api-access-dhzln\") pod \"authorino-f99f4b5cd-wx72l\" (UID: \"4a1006b9-b5d7-474d-a97a-8d268929ef3d\") " pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:22.421491 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.421416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhzln\" (UniqueName: \"kubernetes.io/projected/4a1006b9-b5d7-474d-a97a-8d268929ef3d-kube-api-access-dhzln\") pod \"authorino-f99f4b5cd-wx72l\" (UID: \"4a1006b9-b5d7-474d-a97a-8d268929ef3d\") " pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:22.431193 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.431168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhzln\" (UniqueName: \"kubernetes.io/projected/4a1006b9-b5d7-474d-a97a-8d268929ef3d-kube-api-access-dhzln\") pod \"authorino-f99f4b5cd-wx72l\" (UID: \"4a1006b9-b5d7-474d-a97a-8d268929ef3d\") " pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:22.517958 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.517920 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:22.650060 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.649994 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wx72l"] Apr 21 15:06:22.652360 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:06:22.652330 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a1006b9_b5d7_474d_a97a_8d268929ef3d.slice/crio-30fe029fd45838518038b086190863a7404564fb944207ac994c662ef5f36d64 WatchSource:0}: Error finding container 30fe029fd45838518038b086190863a7404564fb944207ac994c662ef5f36d64: Status 404 returned error can't find the container with id 30fe029fd45838518038b086190863a7404564fb944207ac994c662ef5f36d64 Apr 21 15:06:22.733298 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:22.733261 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" event={"ID":"4a1006b9-b5d7-474d-a97a-8d268929ef3d","Type":"ContainerStarted","Data":"30fe029fd45838518038b086190863a7404564fb944207ac994c662ef5f36d64"} Apr 21 15:06:26.755988 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:26.755947 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" event={"ID":"4a1006b9-b5d7-474d-a97a-8d268929ef3d","Type":"ContainerStarted","Data":"0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887"} Apr 21 15:06:26.788916 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:26.788855 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" podStartSLOduration=1.199741576 podStartE2EDuration="4.788840365s" podCreationTimestamp="2026-04-21 15:06:22 +0000 UTC" firstStartedPulling="2026-04-21 
15:06:22.653642365 +0000 UTC m=+625.507517809" lastFinishedPulling="2026-04-21 15:06:26.242741144 +0000 UTC m=+629.096616598" observedRunningTime="2026-04-21 15:06:26.787000705 +0000 UTC m=+629.640876169" watchObservedRunningTime="2026-04-21 15:06:26.788840365 +0000 UTC m=+629.642715827" Apr 21 15:06:26.806551 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:26.806518 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wx72l"] Apr 21 15:06:28.761229 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:28.761188 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" podUID="4a1006b9-b5d7-474d-a97a-8d268929ef3d" containerName="authorino" containerID="cri-o://0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887" gracePeriod=30 Apr 21 15:06:29.002253 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.002230 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:29.162728 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.162632 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhzln\" (UniqueName: \"kubernetes.io/projected/4a1006b9-b5d7-474d-a97a-8d268929ef3d-kube-api-access-dhzln\") pod \"4a1006b9-b5d7-474d-a97a-8d268929ef3d\" (UID: \"4a1006b9-b5d7-474d-a97a-8d268929ef3d\") " Apr 21 15:06:29.164661 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.164636 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1006b9-b5d7-474d-a97a-8d268929ef3d-kube-api-access-dhzln" (OuterVolumeSpecName: "kube-api-access-dhzln") pod "4a1006b9-b5d7-474d-a97a-8d268929ef3d" (UID: "4a1006b9-b5d7-474d-a97a-8d268929ef3d"). InnerVolumeSpecName "kube-api-access-dhzln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:06:29.264018 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.263977 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhzln\" (UniqueName: \"kubernetes.io/projected/4a1006b9-b5d7-474d-a97a-8d268929ef3d-kube-api-access-dhzln\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:06:29.765564 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.765531 2572 generic.go:358] "Generic (PLEG): container finished" podID="4a1006b9-b5d7-474d-a97a-8d268929ef3d" containerID="0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887" exitCode=0 Apr 21 15:06:29.766090 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.765603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" event={"ID":"4a1006b9-b5d7-474d-a97a-8d268929ef3d","Type":"ContainerDied","Data":"0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887"} Apr 21 15:06:29.766090 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.765607 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" Apr 21 15:06:29.766090 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.765628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-wx72l" event={"ID":"4a1006b9-b5d7-474d-a97a-8d268929ef3d","Type":"ContainerDied","Data":"30fe029fd45838518038b086190863a7404564fb944207ac994c662ef5f36d64"} Apr 21 15:06:29.766090 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.765642 2572 scope.go:117] "RemoveContainer" containerID="0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887" Apr 21 15:06:29.773653 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.773636 2572 scope.go:117] "RemoveContainer" containerID="0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887" Apr 21 15:06:29.773952 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:06:29.773930 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887\": container with ID starting with 0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887 not found: ID does not exist" containerID="0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887" Apr 21 15:06:29.774053 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.773959 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887"} err="failed to get container status \"0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887\": rpc error: code = NotFound desc = could not find container \"0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887\": container with ID starting with 0536e1a2e45d8a0936447e8e2e6f4f873e2ccdcd3583426432ddd3a5e867e887 not found: ID does not exist" Apr 21 15:06:29.784678 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.784645 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wx72l"] Apr 21 15:06:29.786758 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:29.786729 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-wx72l"] Apr 21 15:06:31.721725 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:31.721678 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1006b9-b5d7-474d-a97a-8d268929ef3d" path="/var/lib/kubelet/pods/4a1006b9-b5d7-474d-a97a-8d268929ef3d/volumes" Apr 21 15:06:56.922454 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.922419 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc"] Apr 21 15:06:56.922937 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.922714 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a1006b9-b5d7-474d-a97a-8d268929ef3d" containerName="authorino" Apr 21 15:06:56.922937 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.922726 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1006b9-b5d7-474d-a97a-8d268929ef3d" containerName="authorino" Apr 21 15:06:56.922937 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.922787 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a1006b9-b5d7-474d-a97a-8d268929ef3d" containerName="authorino" Apr 21 15:06:56.925005 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.924989 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:56.928679 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.928653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:06:56.928819 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.928747 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9s5sp\"" Apr 21 15:06:56.929968 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.929935 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:06:56.935701 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.935677 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc"] Apr 21 15:06:56.941612 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.941587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:56.941708 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.941644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:56.941757 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:56.941719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlz9g\" (UniqueName: \"kubernetes.io/projected/0e5483ac-647f-48f7-8da4-49bbd5831263-kube-api-access-vlz9g\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.043148 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.043097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.043346 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.043170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.043346 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.043210 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vlz9g\" (UniqueName: \"kubernetes.io/projected/0e5483ac-647f-48f7-8da4-49bbd5831263-kube-api-access-vlz9g\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.043565 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.043542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.043613 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.043555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.053558 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.053525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlz9g\" (UniqueName: \"kubernetes.io/projected/0e5483ac-647f-48f7-8da4-49bbd5831263-kube-api-access-vlz9g\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.233760 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.233725 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:06:57.381010 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.380986 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc"] Apr 21 15:06:57.382900 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:06:57.382867 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e5483ac_647f_48f7_8da4_49bbd5831263.slice/crio-821dd688bd3261d9dd111a11b1827850a78daf7601d1ae4b8df5353317d077bd WatchSource:0}: Error finding container 821dd688bd3261d9dd111a11b1827850a78daf7601d1ae4b8df5353317d077bd: Status 404 returned error can't find the container with id 821dd688bd3261d9dd111a11b1827850a78daf7601d1ae4b8df5353317d077bd Apr 21 15:06:57.384733 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.384717 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:06:57.863874 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.863786 2572 generic.go:358] "Generic (PLEG): container finished" podID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerID="3f7f0702c9d32362344c5d450fd4edb8aec5f5976bae61b49982c2d11c84c8bf" exitCode=0 Apr 21 15:06:57.864041 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.863871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" event={"ID":"0e5483ac-647f-48f7-8da4-49bbd5831263","Type":"ContainerDied","Data":"3f7f0702c9d32362344c5d450fd4edb8aec5f5976bae61b49982c2d11c84c8bf"} Apr 21 15:06:57.864041 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:57.863928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" event={"ID":"0e5483ac-647f-48f7-8da4-49bbd5831263","Type":"ContainerStarted","Data":"821dd688bd3261d9dd111a11b1827850a78daf7601d1ae4b8df5353317d077bd"} Apr 21 15:06:58.868587 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:58.868560 2572 generic.go:358] "Generic (PLEG): container finished" podID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerID="974eb1fc70b87f43486f93af343b1786f62fa2f6eb6d33dbff475e0258019410" exitCode=0 Apr 21 15:06:58.868960 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:58.868626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" event={"ID":"0e5483ac-647f-48f7-8da4-49bbd5831263","Type":"ContainerDied","Data":"974eb1fc70b87f43486f93af343b1786f62fa2f6eb6d33dbff475e0258019410"} Apr 21 15:06:59.874012 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:59.873978 2572 generic.go:358] "Generic (PLEG): container finished" podID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerID="f8931aa9a4448b5a86c05c1fddfba7363a70128896c6da713cb5125eddf1594c" exitCode=0 Apr 21 15:06:59.874405 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:06:59.874051 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" event={"ID":"0e5483ac-647f-48f7-8da4-49bbd5831263","Type":"ContainerDied","Data":"f8931aa9a4448b5a86c05c1fddfba7363a70128896c6da713cb5125eddf1594c"} Apr 21 15:07:00.997922 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:00.997886 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:07:01.072558 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.072536 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-util\") pod \"0e5483ac-647f-48f7-8da4-49bbd5831263\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " Apr 21 15:07:01.072684 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.072581 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-bundle\") pod \"0e5483ac-647f-48f7-8da4-49bbd5831263\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " Apr 21 15:07:01.072684 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.072611 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlz9g\" (UniqueName: \"kubernetes.io/projected/0e5483ac-647f-48f7-8da4-49bbd5831263-kube-api-access-vlz9g\") pod \"0e5483ac-647f-48f7-8da4-49bbd5831263\" (UID: \"0e5483ac-647f-48f7-8da4-49bbd5831263\") " Apr 21 15:07:01.073021 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.072998 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-bundle" (OuterVolumeSpecName: "bundle") pod "0e5483ac-647f-48f7-8da4-49bbd5831263" (UID: "0e5483ac-647f-48f7-8da4-49bbd5831263"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:07:01.074599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.074564 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5483ac-647f-48f7-8da4-49bbd5831263-kube-api-access-vlz9g" (OuterVolumeSpecName: "kube-api-access-vlz9g") pod "0e5483ac-647f-48f7-8da4-49bbd5831263" (UID: "0e5483ac-647f-48f7-8da4-49bbd5831263"). InnerVolumeSpecName "kube-api-access-vlz9g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:07:01.081496 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.081459 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-util" (OuterVolumeSpecName: "util") pod "0e5483ac-647f-48f7-8da4-49bbd5831263" (UID: "0e5483ac-647f-48f7-8da4-49bbd5831263"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:07:01.173526 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.173476 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-util\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:07:01.173526 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.173496 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e5483ac-647f-48f7-8da4-49bbd5831263-bundle\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:07:01.173526 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.173506 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vlz9g\" (UniqueName: \"kubernetes.io/projected/0e5483ac-647f-48f7-8da4-49bbd5831263-kube-api-access-vlz9g\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:07:01.884266 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.884243 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" Apr 21 15:07:01.884266 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.884242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350d4fqc" event={"ID":"0e5483ac-647f-48f7-8da4-49bbd5831263","Type":"ContainerDied","Data":"821dd688bd3261d9dd111a11b1827850a78daf7601d1ae4b8df5353317d077bd"} Apr 21 15:07:01.884410 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:01.884286 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="821dd688bd3261d9dd111a11b1827850a78daf7601d1ae4b8df5353317d077bd" Apr 21 15:07:22.137502 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137464 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137753 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="extract" Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137765 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="extract" Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137785 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="pull" Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137790 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="pull" Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137800 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="util" Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137806 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="util" Apr 21 15:07:22.137883 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.137847 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e5483ac-647f-48f7-8da4-49bbd5831263" containerName="extract" Apr 21 15:07:22.139787 ip-10-0-134-40 
kubenswrapper[2572]: I0421 15:07:22.139772 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:22.142892 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.142868 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 15:07:22.143035 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.142868 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 15:07:22.144092 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.144067 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-8dzph\"" Apr 21 15:07:22.144092 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.144081 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 15:07:22.152548 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.152527 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:07:22.237645 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.237608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qdt\" (UniqueName: \"kubernetes.io/projected/28c5b114-e4f5-4788-8f1f-887fb852052d-kube-api-access-w8qdt\") pod \"maas-keycloak-0\" (UID: \"28c5b114-e4f5-4788-8f1f-887fb852052d\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:22.338748 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.338716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qdt\" (UniqueName: \"kubernetes.io/projected/28c5b114-e4f5-4788-8f1f-887fb852052d-kube-api-access-w8qdt\") pod \"maas-keycloak-0\" (UID: \"28c5b114-e4f5-4788-8f1f-887fb852052d\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:22.350564 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.350536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qdt\" (UniqueName: \"kubernetes.io/projected/28c5b114-e4f5-4788-8f1f-887fb852052d-kube-api-access-w8qdt\") pod \"maas-keycloak-0\" (UID: \"28c5b114-e4f5-4788-8f1f-887fb852052d\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:22.449533 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.449456 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:22.583355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.583330 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:07:22.585404 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:07:22.585376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c5b114_e4f5_4788_8f1f_887fb852052d.slice/crio-0689e990a9b6a4291da4b7ddec8b19a808a2b2b01e32acae16f9519e8c3cbc26 WatchSource:0}: Error finding container 0689e990a9b6a4291da4b7ddec8b19a808a2b2b01e32acae16f9519e8c3cbc26: Status 404 returned error can't find the container with id 0689e990a9b6a4291da4b7ddec8b19a808a2b2b01e32acae16f9519e8c3cbc26 Apr 21 15:07:22.967453 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:22.967421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"28c5b114-e4f5-4788-8f1f-887fb852052d","Type":"ContainerStarted","Data":"0689e990a9b6a4291da4b7ddec8b19a808a2b2b01e32acae16f9519e8c3cbc26"} Apr 21 15:07:27.992186 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:27.992148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"28c5b114-e4f5-4788-8f1f-887fb852052d","Type":"ContainerStarted","Data":"be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4"} Apr 21 15:07:28.017483 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:28.017425 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.2426954430000001 podStartE2EDuration="6.017411214s" podCreationTimestamp="2026-04-21 15:07:22 +0000 UTC" firstStartedPulling="2026-04-21 15:07:22.586740951 +0000 UTC m=+685.440616391" lastFinishedPulling="2026-04-21 15:07:27.361456719 +0000 UTC m=+690.215332162" observedRunningTime="2026-04-21 15:07:28.017324297 +0000 UTC m=+690.871199760" watchObservedRunningTime="2026-04-21 15:07:28.017411214 +0000 UTC m=+690.871286675" Apr 21 15:07:28.449665 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:28.449558 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:28.451782 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:28.451740 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:29.450924 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:29.450860 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:30.450391 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:30.450337 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:31.450400 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:31.450342 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:32.449886 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:32.449845 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:32.450850 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:32.450816 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:33.450439 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:33.450387 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:34.450173 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:34.450118 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:35.449978 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:35.449919 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:36.450860 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:36.450816 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:37.450866 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:37.450812 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:38.450857 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:38.450802 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:39.450530 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:39.450483 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:40.450479 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:40.450418 
2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 15:07:41.580350 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:41.579988 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:41.598986 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:41.598938 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:07:51.587092 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:51.587007 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 15:07:52.856710 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:52.856674 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-787c446565-79c42"] Apr 21 15:07:52.858800 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:52.858782 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:52.862141 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:52.862114 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wqcsw\"" Apr 21 15:07:52.872956 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:52.872931 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-787c446565-79c42"] Apr 21 15:07:52.928575 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:52.928548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhtr\" (UniqueName: \"kubernetes.io/projected/6f54c4dd-c691-44d5-be3a-7026faf0e020-kube-api-access-4rhtr\") pod \"authorino-787c446565-79c42\" (UID: \"6f54c4dd-c691-44d5-be3a-7026faf0e020\") " pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:53.027211 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.027182 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-787c446565-79c42"] Apr 21 15:07:53.027343 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:07:53.027326 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4rhtr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-787c446565-79c42" podUID="6f54c4dd-c691-44d5-be3a-7026faf0e020" Apr 21 15:07:53.029367 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.029347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhtr\" (UniqueName: \"kubernetes.io/projected/6f54c4dd-c691-44d5-be3a-7026faf0e020-kube-api-access-4rhtr\") pod \"authorino-787c446565-79c42\" (UID: \"6f54c4dd-c691-44d5-be3a-7026faf0e020\") " pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:53.038753 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.038726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhtr\" (UniqueName: \"kubernetes.io/projected/6f54c4dd-c691-44d5-be3a-7026faf0e020-kube-api-access-4rhtr\") pod \"authorino-787c446565-79c42\" (UID: 
\"6f54c4dd-c691-44d5-be3a-7026faf0e020\") " pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:53.074208 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.074184 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-65df47659c-j8rk4"] Apr 21 15:07:53.076408 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.076394 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.079332 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.079313 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 15:07:53.088023 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.088006 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-65df47659c-j8rk4"] Apr 21 15:07:53.097586 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.097566 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:53.102614 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.102597 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:53.129966 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.129895 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rhtr\" (UniqueName: \"kubernetes.io/projected/6f54c4dd-c691-44d5-be3a-7026faf0e020-kube-api-access-4rhtr\") pod \"6f54c4dd-c691-44d5-be3a-7026faf0e020\" (UID: \"6f54c4dd-c691-44d5-be3a-7026faf0e020\") " Apr 21 15:07:53.130062 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.130047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kf24\" (UniqueName: \"kubernetes.io/projected/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-kube-api-access-2kf24\") pod \"authorino-65df47659c-j8rk4\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.130114 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.130071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-tls-cert\") pod \"authorino-65df47659c-j8rk4\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.131770 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.131752 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f54c4dd-c691-44d5-be3a-7026faf0e020-kube-api-access-4rhtr" (OuterVolumeSpecName: "kube-api-access-4rhtr") pod "6f54c4dd-c691-44d5-be3a-7026faf0e020" (UID: "6f54c4dd-c691-44d5-be3a-7026faf0e020"). InnerVolumeSpecName "kube-api-access-4rhtr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:07:53.230491 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.230464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kf24\" (UniqueName: \"kubernetes.io/projected/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-kube-api-access-2kf24\") pod \"authorino-65df47659c-j8rk4\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.230567 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.230494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-tls-cert\") pod \"authorino-65df47659c-j8rk4\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.230567 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.230540 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rhtr\" (UniqueName: \"kubernetes.io/projected/6f54c4dd-c691-44d5-be3a-7026faf0e020-kube-api-access-4rhtr\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:07:53.232681 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.232658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-tls-cert\") pod \"authorino-65df47659c-j8rk4\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.242293 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.242269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kf24\" (UniqueName: \"kubernetes.io/projected/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-kube-api-access-2kf24\") pod \"authorino-65df47659c-j8rk4\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.384812 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.384762 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:07:53.514284 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:53.514259 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-65df47659c-j8rk4"] Apr 21 15:07:53.515078 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:07:53.515051 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce027b0f_0dd7_49b0_a0af_ccfe4feb3dfe.slice/crio-4e65b49ae31d8e286a50b1a5ac52f081bc64eab5c8a4c74fdff21e9c544a61f3 WatchSource:0}: Error finding container 4e65b49ae31d8e286a50b1a5ac52f081bc64eab5c8a4c74fdff21e9c544a61f3: Status 404 returned error can't find the container with id 4e65b49ae31d8e286a50b1a5ac52f081bc64eab5c8a4c74fdff21e9c544a61f3 Apr 21 15:07:54.102748 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:54.102722 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-787c446565-79c42" Apr 21 15:07:54.102748 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:54.102718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65df47659c-j8rk4" event={"ID":"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe","Type":"ContainerStarted","Data":"92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc"} Apr 21 15:07:54.103153 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:54.102770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65df47659c-j8rk4" event={"ID":"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe","Type":"ContainerStarted","Data":"4e65b49ae31d8e286a50b1a5ac52f081bc64eab5c8a4c74fdff21e9c544a61f3"} Apr 21 15:07:54.129923 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:54.127310 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-65df47659c-j8rk4" podStartSLOduration=0.717735195 podStartE2EDuration="1.12729302s" podCreationTimestamp="2026-04-21 15:07:53 +0000 UTC" firstStartedPulling="2026-04-21 15:07:53.51633529 +0000 UTC m=+716.370210730" lastFinishedPulling="2026-04-21 15:07:53.925893098 +0000 UTC m=+716.779768555" observedRunningTime="2026-04-21 15:07:54.123651359 +0000 UTC m=+716.977526833" watchObservedRunningTime="2026-04-21 15:07:54.12729302 +0000 UTC m=+716.981168495" Apr 21 15:07:54.153258 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:54.153219 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-787c446565-79c42"] Apr 21 15:07:54.165957 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:54.165924 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-787c446565-79c42"] Apr 21 15:07:55.547075 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.547038 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-c8fb586cd-79j68"] Apr 21 15:07:55.550090 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.550074 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:07:55.554284 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.554263 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jm2k7\"" Apr 21 15:07:55.564231 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.564189 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c8fb586cd-79j68"] Apr 21 15:07:55.648513 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.648480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5nl\" (UniqueName: \"kubernetes.io/projected/3de1c393-816a-4e28-b93d-42e802a11c30-kube-api-access-pd5nl\") pod \"maas-controller-c8fb586cd-79j68\" (UID: \"3de1c393-816a-4e28-b93d-42e802a11c30\") " pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:07:55.721263 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.721228 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f54c4dd-c691-44d5-be3a-7026faf0e020" path="/var/lib/kubelet/pods/6f54c4dd-c691-44d5-be3a-7026faf0e020/volumes" Apr 21 15:07:55.749043 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.749017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd5nl\" (UniqueName: \"kubernetes.io/projected/3de1c393-816a-4e28-b93d-42e802a11c30-kube-api-access-pd5nl\") pod \"maas-controller-c8fb586cd-79j68\" (UID: \"3de1c393-816a-4e28-b93d-42e802a11c30\") " pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:07:55.762406 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.762376 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5nl\" (UniqueName: \"kubernetes.io/projected/3de1c393-816a-4e28-b93d-42e802a11c30-kube-api-access-pd5nl\") pod \"maas-controller-c8fb586cd-79j68\" (UID: \"3de1c393-816a-4e28-b93d-42e802a11c30\") " pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:07:55.859162 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.859093 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:07:55.982453 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:55.982429 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c8fb586cd-79j68"] Apr 21 15:07:55.984919 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:07:55.984868 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3de1c393_816a_4e28_b93d_42e802a11c30.slice/crio-dd6d2599b46e5e155ce6b44f0e25114161182335e7c0d75157582b3db095e797 WatchSource:0}: Error finding container dd6d2599b46e5e155ce6b44f0e25114161182335e7c0d75157582b3db095e797: Status 404 returned error can't find the container with id dd6d2599b46e5e155ce6b44f0e25114161182335e7c0d75157582b3db095e797 Apr 21 15:07:56.110549 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:56.110485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c8fb586cd-79j68" event={"ID":"3de1c393-816a-4e28-b93d-42e802a11c30","Type":"ContainerStarted","Data":"dd6d2599b46e5e155ce6b44f0e25114161182335e7c0d75157582b3db095e797"} Apr 21 15:07:59.123470 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:59.123433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c8fb586cd-79j68" event={"ID":"3de1c393-816a-4e28-b93d-42e802a11c30","Type":"ContainerStarted","Data":"e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f"} Apr 21 15:07:59.124080 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:59.123547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:07:59.151482 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:07:59.151421 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-c8fb586cd-79j68" podStartSLOduration=1.926373388 podStartE2EDuration="4.151404296s" podCreationTimestamp="2026-04-21 15:07:55 +0000 UTC" firstStartedPulling="2026-04-21 15:07:55.989771682 +0000 UTC m=+718.843647122" lastFinishedPulling="2026-04-21 15:07:58.214802586 +0000 UTC m=+721.068678030" observedRunningTime="2026-04-21 15:07:59.149759396 +0000 UTC m=+722.003634860" watchObservedRunningTime="2026-04-21 15:07:59.151404296 +0000 UTC m=+722.005279759" Apr 21 15:08:10.132136 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:10.132104 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:08:23.260783 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:23.260749 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-c8fb586cd-79j68"] Apr 21 15:08:23.261167 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:23.260992 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-c8fb586cd-79j68" podUID="3de1c393-816a-4e28-b93d-42e802a11c30" containerName="manager" containerID="cri-o://e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f" gracePeriod=10 Apr 21 15:08:23.493301 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:23.493281 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:08:23.559311 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:23.559258 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd5nl\" (UniqueName: \"kubernetes.io/projected/3de1c393-816a-4e28-b93d-42e802a11c30-kube-api-access-pd5nl\") pod \"3de1c393-816a-4e28-b93d-42e802a11c30\" (UID: \"3de1c393-816a-4e28-b93d-42e802a11c30\") " Apr 21 15:08:23.561245 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:23.561222 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de1c393-816a-4e28-b93d-42e802a11c30-kube-api-access-pd5nl" (OuterVolumeSpecName: "kube-api-access-pd5nl") pod "3de1c393-816a-4e28-b93d-42e802a11c30" (UID: "3de1c393-816a-4e28-b93d-42e802a11c30"). InnerVolumeSpecName "kube-api-access-pd5nl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:08:23.660504 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:23.660482 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd5nl\" (UniqueName: \"kubernetes.io/projected/3de1c393-816a-4e28-b93d-42e802a11c30-kube-api-access-pd5nl\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:08:24.216588 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.216554 2572 generic.go:358] "Generic (PLEG): container finished" podID="3de1c393-816a-4e28-b93d-42e802a11c30" containerID="e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f" exitCode=0 Apr 21 15:08:24.216756 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.216617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c8fb586cd-79j68" event={"ID":"3de1c393-816a-4e28-b93d-42e802a11c30","Type":"ContainerDied","Data":"e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f"} Apr 21 15:08:24.216756 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.216622 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c8fb586cd-79j68" Apr 21 15:08:24.216756 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.216646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c8fb586cd-79j68" event={"ID":"3de1c393-816a-4e28-b93d-42e802a11c30","Type":"ContainerDied","Data":"dd6d2599b46e5e155ce6b44f0e25114161182335e7c0d75157582b3db095e797"} Apr 21 15:08:24.216756 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.216667 2572 scope.go:117] "RemoveContainer" containerID="e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f" Apr 21 15:08:24.225109 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.225090 2572 scope.go:117] "RemoveContainer" containerID="e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f" Apr 21 15:08:24.225382 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:08:24.225363 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f\": container with ID starting with e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f not found: ID does not exist" containerID="e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f" Apr 21 15:08:24.225443 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.225393 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f"} err="failed to get container status \"e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f\": rpc error: code = NotFound desc = could not find container \"e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f\": container with ID starting with e03343554cfb974c03bfe6b833f5b2651f1f6d6a6e0b2f33c8010be598ecbc3f not found: ID does not exist" Apr 21 15:08:24.238267 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.238240 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-c8fb586cd-79j68"] Apr 21 15:08:24.246015 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:24.245992 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-c8fb586cd-79j68"] Apr 21 15:08:25.117249 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:25.117216 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:08:25.117666 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:25.117433 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" containerID="cri-o://be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4" gracePeriod=30 Apr 21 15:08:25.721393 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:25.721356 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de1c393-816a-4e28-b93d-42e802a11c30" path="/var/lib/kubelet/pods/3de1c393-816a-4e28-b93d-42e802a11c30/volumes" Apr 21 15:08:27.150731 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.150708 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.229184 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.229153 2572 generic.go:358] "Generic (PLEG): container finished" podID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerID="be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4" exitCode=143 Apr 21 15:08:27.229314 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.229210 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.229314 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.229244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"28c5b114-e4f5-4788-8f1f-887fb852052d","Type":"ContainerDied","Data":"be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4"} Apr 21 15:08:27.229314 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.229288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"28c5b114-e4f5-4788-8f1f-887fb852052d","Type":"ContainerDied","Data":"0689e990a9b6a4291da4b7ddec8b19a808a2b2b01e32acae16f9519e8c3cbc26"} Apr 21 15:08:27.229314 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.229306 2572 scope.go:117] "RemoveContainer" containerID="be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4" Apr 21 15:08:27.237923 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.237893 2572 scope.go:117] "RemoveContainer" containerID="be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4" Apr 21 15:08:27.238192 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:08:27.238172 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4\": container with ID starting with be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4 not found: ID does not exist" containerID="be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4" Apr 21 15:08:27.238255 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.238198 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4"} err="failed to get container status \"be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4\": rpc error: code = NotFound desc = could not find container \"be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4\": container with ID starting with be3bcdc175a407114e5fb2820841f947e9156cdbb0994f7ee37d915dfaba52b4 not found: ID does not exist" Apr 21 15:08:27.287399 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.287347 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8qdt\" (UniqueName: \"kubernetes.io/projected/28c5b114-e4f5-4788-8f1f-887fb852052d-kube-api-access-w8qdt\") pod \"28c5b114-e4f5-4788-8f1f-887fb852052d\" (UID: \"28c5b114-e4f5-4788-8f1f-887fb852052d\") " Apr 21 15:08:27.289295 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.289274 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c5b114-e4f5-4788-8f1f-887fb852052d-kube-api-access-w8qdt" (OuterVolumeSpecName: "kube-api-access-w8qdt") pod "28c5b114-e4f5-4788-8f1f-887fb852052d" (UID: "28c5b114-e4f5-4788-8f1f-887fb852052d"). InnerVolumeSpecName "kube-api-access-w8qdt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:08:27.388699 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.388677 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8qdt\" (UniqueName: \"kubernetes.io/projected/28c5b114-e4f5-4788-8f1f-887fb852052d-kube-api-access-w8qdt\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:08:27.551790 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.551735 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:08:27.556024 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.556003 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:08:27.581793 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.581761 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:08:27.582108 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.582090 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" Apr 21 15:08:27.582186 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.582109 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" Apr 21 15:08:27.582186 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.582128 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3de1c393-816a-4e28-b93d-42e802a11c30" containerName="manager" Apr 21 15:08:27.582186 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.582136 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de1c393-816a-4e28-b93d-42e802a11c30" containerName="manager" Apr 21 15:08:27.582344 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.582222 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3de1c393-816a-4e28-b93d-42e802a11c30" containerName="manager" Apr 21 15:08:27.582344 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.582236 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" containerName="keycloak" Apr 21 15:08:27.586575 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.586558 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.589318 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.589294 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 15:08:27.589423 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.589297 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 21 15:08:27.589423 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.589416 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 15:08:27.589607 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.589589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-8dzph\"" Apr 21 15:08:27.589690 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.589667 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 15:08:27.594855 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.594833 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:08:27.689874 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.689850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/a8044788-5efa-448e-9fde-ff191dd649f4-test-realms\") pod \"maas-keycloak-0\" (UID: \"a8044788-5efa-448e-9fde-ff191dd649f4\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.690013 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.689887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvgvs\" (UniqueName: \"kubernetes.io/projected/a8044788-5efa-448e-9fde-ff191dd649f4-kube-api-access-fvgvs\") pod \"maas-keycloak-0\" (UID: \"a8044788-5efa-448e-9fde-ff191dd649f4\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.721827 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.721793 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c5b114-e4f5-4788-8f1f-887fb852052d" path="/var/lib/kubelet/pods/28c5b114-e4f5-4788-8f1f-887fb852052d/volumes" Apr 21 15:08:27.790825 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.790803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/a8044788-5efa-448e-9fde-ff191dd649f4-test-realms\") pod \"maas-keycloak-0\" (UID: \"a8044788-5efa-448e-9fde-ff191dd649f4\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.790952 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.790831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvgvs\" (UniqueName: \"kubernetes.io/projected/a8044788-5efa-448e-9fde-ff191dd649f4-kube-api-access-fvgvs\") pod \"maas-keycloak-0\" (UID: \"a8044788-5efa-448e-9fde-ff191dd649f4\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.791516 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.791496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/a8044788-5efa-448e-9fde-ff191dd649f4-test-realms\") pod \"maas-keycloak-0\" (UID: \"a8044788-5efa-448e-9fde-ff191dd649f4\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.799581 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.799560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvgvs\" (UniqueName: \"kubernetes.io/projected/a8044788-5efa-448e-9fde-ff191dd649f4-kube-api-access-fvgvs\") pod \"maas-keycloak-0\" (UID: \"a8044788-5efa-448e-9fde-ff191dd649f4\") " pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:27.896682 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:27.896630 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:28.026196 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:28.026170 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 15:08:28.028413 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:08:28.028386 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8044788_5efa_448e_9fde_ff191dd649f4.slice/crio-4fd6ebdb32b9b39af4a0d5ae46a607848f51504287a97fb45a1e598dfcc5ad56 WatchSource:0}: Error finding container 4fd6ebdb32b9b39af4a0d5ae46a607848f51504287a97fb45a1e598dfcc5ad56: Status 404 returned error can't find the container with id 4fd6ebdb32b9b39af4a0d5ae46a607848f51504287a97fb45a1e598dfcc5ad56 Apr 21 15:08:28.234135 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:28.234105 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"a8044788-5efa-448e-9fde-ff191dd649f4","Type":"ContainerStarted","Data":"4fd6ebdb32b9b39af4a0d5ae46a607848f51504287a97fb45a1e598dfcc5ad56"} Apr 21 15:08:29.241194 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:29.241157 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"a8044788-5efa-448e-9fde-ff191dd649f4","Type":"ContainerStarted","Data":"95298ff517cda886479e12007684aceed56291f6ae847f8bc0c59f67b1e8553c"} Apr 21 15:08:29.261624 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:29.261576 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.928210421 podStartE2EDuration="2.261561934s" podCreationTimestamp="2026-04-21 15:08:27 +0000 UTC" firstStartedPulling="2026-04-21 15:08:28.03011229 +0000 UTC m=+750.883987733" lastFinishedPulling="2026-04-21 15:08:28.363463806 +0000 UTC m=+751.217339246" observedRunningTime="2026-04-21 15:08:29.259686415 +0000 UTC m=+752.113561879" watchObservedRunningTime="2026-04-21 15:08:29.261561934 +0000 UTC m=+752.115437395" Apr 21 15:08:29.897410 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:29.897364 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:29.899146 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:29.899109 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:30.897970 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:30.897889 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 
15:08:31.898111 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:31.898060 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:32.898054 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:32.898001 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:33.897846 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:33.897800 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:34.897394 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:34.897347 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:35.897866 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:35.897814 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:36.898211 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:36.898157 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:37.897432 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:37.897380 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:37.897934 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:37.897873 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:38.897488 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:38.897438 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:39.897357 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:39.897312 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 
10.134.0.39:9000: connect: connection refused" Apr 21 15:08:40.897531 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:40.897481 2572 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.39:9000/health/started\": dial tcp 10.134.0.39:9000: connect: connection refused" Apr 21 15:08:42.032225 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:42.032182 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 15:08:42.052956 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:42.052882 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:08:45.878397 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:45.878357 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-8989f94bb-22sgb"] Apr 21 15:08:45.886549 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:45.886522 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:45.889530 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:45.889499 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-8989f94bb-22sgb"] Apr 21 15:08:45.889829 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:45.889807 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 15:08:45.889975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:45.889843 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 15:08:45.891086 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:45.891066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xx77w\"" Apr 21 15:08:46.047389 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.047349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4a065cd6-7dd1-4bb4-8392-336bfae3d838-maas-api-tls\") pod \"maas-api-8989f94bb-22sgb\" (UID: \"4a065cd6-7dd1-4bb4-8392-336bfae3d838\") " pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.047389 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.047397 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7sb4\" (UniqueName: \"kubernetes.io/projected/4a065cd6-7dd1-4bb4-8392-336bfae3d838-kube-api-access-z7sb4\") pod \"maas-api-8989f94bb-22sgb\" (UID: \"4a065cd6-7dd1-4bb4-8392-336bfae3d838\") " pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.148244 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.148149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4a065cd6-7dd1-4bb4-8392-336bfae3d838-maas-api-tls\") pod \"maas-api-8989f94bb-22sgb\" (UID: \"4a065cd6-7dd1-4bb4-8392-336bfae3d838\") " pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.148244 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.148196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7sb4\" (UniqueName: 
\"kubernetes.io/projected/4a065cd6-7dd1-4bb4-8392-336bfae3d838-kube-api-access-z7sb4\") pod \"maas-api-8989f94bb-22sgb\" (UID: \"4a065cd6-7dd1-4bb4-8392-336bfae3d838\") " pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.151264 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.151233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/4a065cd6-7dd1-4bb4-8392-336bfae3d838-maas-api-tls\") pod \"maas-api-8989f94bb-22sgb\" (UID: \"4a065cd6-7dd1-4bb4-8392-336bfae3d838\") " pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.158084 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.158054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7sb4\" (UniqueName: \"kubernetes.io/projected/4a065cd6-7dd1-4bb4-8392-336bfae3d838-kube-api-access-z7sb4\") pod \"maas-api-8989f94bb-22sgb\" (UID: \"4a065cd6-7dd1-4bb4-8392-336bfae3d838\") " pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.199846 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.199798 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:46.377724 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:46.377695 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-8989f94bb-22sgb"] Apr 21 15:08:46.379570 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:08:46.379535 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a065cd6_7dd1_4bb4_8392_336bfae3d838.slice/crio-1c94a658efb3f25850bd43424a2e7f1fe5a414563ecc637f7808fd52b3f3a2ee WatchSource:0}: Error finding container 1c94a658efb3f25850bd43424a2e7f1fe5a414563ecc637f7808fd52b3f3a2ee: Status 404 returned error can't find the container with id 1c94a658efb3f25850bd43424a2e7f1fe5a414563ecc637f7808fd52b3f3a2ee Apr 21 15:08:47.324205 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:47.324148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-8989f94bb-22sgb" event={"ID":"4a065cd6-7dd1-4bb4-8392-336bfae3d838","Type":"ContainerStarted","Data":"1c94a658efb3f25850bd43424a2e7f1fe5a414563ecc637f7808fd52b3f3a2ee"} Apr 21 15:08:48.332034 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:48.331988 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-8989f94bb-22sgb" event={"ID":"4a065cd6-7dd1-4bb4-8392-336bfae3d838","Type":"ContainerStarted","Data":"faf1d0b2d7caf62231ef2df455a7b569a3954a709c95b951ea8ee1d0ca194451"} Apr 21 15:08:48.332780 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:48.332754 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:08:48.353308 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:48.353240 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-8989f94bb-22sgb" podStartSLOduration=1.555557572 podStartE2EDuration="3.353220586s" podCreationTimestamp="2026-04-21 15:08:45 +0000 UTC" firstStartedPulling="2026-04-21 15:08:46.380821201 +0000 UTC m=+769.234696641" lastFinishedPulling="2026-04-21 15:08:48.178484202 +0000 UTC m=+771.032359655" observedRunningTime="2026-04-21 15:08:48.352387893 +0000 UTC m=+771.206263358" watchObservedRunningTime="2026-04-21 15:08:48.353220586 +0000 UTC m=+771.207096051" Apr 21 15:08:52.037730 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:52.037673 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="a8044788-5efa-448e-9fde-ff191dd649f4" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:08:55.354677 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:08:55.354646 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-8989f94bb-22sgb" Apr 21 15:09:02.038670 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:02.038638 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 15:09:14.339495 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.339456 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65df47659c-j8rk4"] Apr 21 15:09:14.339917 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.339680 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-65df47659c-j8rk4" podUID="ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" containerName="authorino" containerID="cri-o://92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc" gracePeriod=30 Apr 21 15:09:14.593675 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.593612 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:09:14.654545 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.654512 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kf24\" (UniqueName: \"kubernetes.io/projected/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-kube-api-access-2kf24\") pod \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " Apr 21 15:09:14.654731 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.654608 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-tls-cert\") pod \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\" (UID: \"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe\") " Apr 21 15:09:14.656598 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.656570 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-kube-api-access-2kf24" (OuterVolumeSpecName: "kube-api-access-2kf24") pod "ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" (UID: "ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe"). InnerVolumeSpecName "kube-api-access-2kf24". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:09:14.664897 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.664866 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" (UID: "ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:09:14.755108 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.755074 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kf24\" (UniqueName: \"kubernetes.io/projected/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-kube-api-access-2kf24\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:09:14.755108 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:14.755102 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe-tls-cert\") on node \"ip-10-0-134-40.ec2.internal\" DevicePath \"\"" Apr 21 15:09:15.443944 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.443889 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" containerID="92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc" exitCode=0 Apr 21 15:09:15.443944 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.443934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65df47659c-j8rk4" event={"ID":"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe","Type":"ContainerDied","Data":"92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc"} Apr 21 15:09:15.444378 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.443954 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65df47659c-j8rk4" Apr 21 15:09:15.444378 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.443976 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65df47659c-j8rk4" event={"ID":"ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe","Type":"ContainerDied","Data":"4e65b49ae31d8e286a50b1a5ac52f081bc64eab5c8a4c74fdff21e9c544a61f3"} Apr 21 15:09:15.444378 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.443991 2572 scope.go:117] "RemoveContainer" containerID="92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc" Apr 21 15:09:15.454490 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.454468 2572 scope.go:117] "RemoveContainer" containerID="92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc" Apr 21 15:09:15.454880 ip-10-0-134-40 kubenswrapper[2572]: E0421 15:09:15.454854 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc\": container with ID starting with 92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc not found: ID does not exist" containerID="92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc" Apr 21 15:09:15.455012 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.454893 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc"} err="failed to get container status \"92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc\": rpc error: code = NotFound desc = could not find container \"92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc\": container with ID starting with 92f06ba971e23c28529634f3f64530810a4807574c9ef764470d90d187e67fdc not found: ID does not exist" Apr 21 15:09:15.474211 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.474187 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65df47659c-j8rk4"] Apr 21 15:09:15.480690 ip-10-0-134-40 
kubenswrapper[2572]: I0421 15:09:15.480671 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-65df47659c-j8rk4"] Apr 21 15:09:15.721831 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:15.721804 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" path="/var/lib/kubelet/pods/ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe/volumes" Apr 21 15:09:39.066124 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.066075 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg"] Apr 21 15:09:39.066637 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.066427 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" containerName="authorino" Apr 21 15:09:39.066637 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.066439 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" containerName="authorino" Apr 21 15:09:39.066637 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.066494 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce027b0f-0dd7-49b0-a0af-ccfe4feb3dfe" containerName="authorino" Apr 21 15:09:39.069763 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.069744 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.072966 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.072941 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-qj6x9\"" Apr 21 15:09:39.074241 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.074213 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 15:09:39.074390 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.074226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 21 15:09:39.074390 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.074225 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 15:09:39.081441 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.081418 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg"] Apr 21 15:09:39.136205 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.136173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.136315 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.136209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.136315 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.136276 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/9555daba-3c59-412e-b7dc-2a4dd184adc6-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.136417 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.136319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srss8\" (UniqueName: \"kubernetes.io/projected/9555daba-3c59-412e-b7dc-2a4dd184adc6-kube-api-access-srss8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.136417 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.136374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.136417 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.136394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.237644 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.237609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.237765 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.237655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.237765 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.237721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.237765 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.237749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.237887 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.237787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/9555daba-3c59-412e-b7dc-2a4dd184adc6-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.237887 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.237825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srss8\" (UniqueName: \"kubernetes.io/projected/9555daba-3c59-412e-b7dc-2a4dd184adc6-kube-api-access-srss8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.238172 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.238146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.238249 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.238191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.238249 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.238226 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.239996 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.239967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9555daba-3c59-412e-b7dc-2a4dd184adc6-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.240259 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.240240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9555daba-3c59-412e-b7dc-2a4dd184adc6-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.249619 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.249598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srss8\" (UniqueName: \"kubernetes.io/projected/9555daba-3c59-412e-b7dc-2a4dd184adc6-kube-api-access-srss8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-8stbg\" (UID: \"9555daba-3c59-412e-b7dc-2a4dd184adc6\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.382305 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.382220 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:39.517936 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.517896 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg"] Apr 21 15:09:39.519847 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:09:39.519822 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9555daba_3c59_412e_b7dc_2a4dd184adc6.slice/crio-f7d5917d217fd139e0cc43feea72bc56e1266e0196dd44b28e53f56a031d4348 WatchSource:0}: Error finding container f7d5917d217fd139e0cc43feea72bc56e1266e0196dd44b28e53f56a031d4348: Status 404 returned error can't find the container with id f7d5917d217fd139e0cc43feea72bc56e1266e0196dd44b28e53f56a031d4348 Apr 21 15:09:39.534998 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:39.534971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" event={"ID":"9555daba-3c59-412e-b7dc-2a4dd184adc6","Type":"ContainerStarted","Data":"f7d5917d217fd139e0cc43feea72bc56e1266e0196dd44b28e53f56a031d4348"} Apr 21 15:09:41.959273 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:41.959238 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h"] Apr 21 15:09:41.962708 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:41.962673 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:41.965702 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:41.965677 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 15:09:41.977276 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:41.977253 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h"] Apr 21 15:09:42.060658 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.060621 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.060658 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.060660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt8m4\" (UniqueName: \"kubernetes.io/projected/624c4717-d415-411d-9325-6ea9b55d94c6-kube-api-access-wt8m4\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.060894 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.060704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.060894 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.060768 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.060894 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.060841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.061075 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.060952 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/624c4717-d415-411d-9325-6ea9b55d94c6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162063 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162242 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162242 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162119 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162242 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/624c4717-d415-411d-9325-6ea9b55d94c6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162242 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162474 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162269 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt8m4\" (UniqueName: \"kubernetes.io/projected/624c4717-d415-411d-9325-6ea9b55d94c6-kube-api-access-wt8m4\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162474 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162587 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.162587 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.162512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.164605 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.164572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/624c4717-d415-411d-9325-6ea9b55d94c6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.165313 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.165278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/624c4717-d415-411d-9325-6ea9b55d94c6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.171572 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.171550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt8m4\" (UniqueName: \"kubernetes.io/projected/624c4717-d415-411d-9325-6ea9b55d94c6-kube-api-access-wt8m4\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h\" (UID: \"624c4717-d415-411d-9325-6ea9b55d94c6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:42.274109 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:42.274027 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:44.622520 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:09:44.622492 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod624c4717_d415_411d_9325_6ea9b55d94c6.slice/crio-4f3550427aa6fb11c6754d3461088102d1097a72797d141ec739e167de8d9c93 WatchSource:0}: Error finding container 4f3550427aa6fb11c6754d3461088102d1097a72797d141ec739e167de8d9c93: Status 404 returned error can't find the container with id 4f3550427aa6fb11c6754d3461088102d1097a72797d141ec739e167de8d9c93 Apr 21 15:09:44.623682 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:44.623654 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h"] Apr 21 15:09:45.565485 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:45.565443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" event={"ID":"9555daba-3c59-412e-b7dc-2a4dd184adc6","Type":"ContainerStarted","Data":"c1da007d9d2a0b0deb85125ba26e1c2ac1c12b5f2e6779a3397be324e3b73c5c"} Apr 21 15:09:45.566987 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:45.566962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" event={"ID":"624c4717-d415-411d-9325-6ea9b55d94c6","Type":"ContainerStarted","Data":"3fa4b04048f3bca3e31e1ebad9e295c20d660a5bde5bcf48a3b9a57b0b5266e7"} Apr 21 15:09:45.566987 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:45.566987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" event={"ID":"624c4717-d415-411d-9325-6ea9b55d94c6","Type":"ContainerStarted","Data":"4f3550427aa6fb11c6754d3461088102d1097a72797d141ec739e167de8d9c93"} Apr 21 15:09:50.588417 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:50.588378 2572 generic.go:358] "Generic (PLEG): container finished" podID="624c4717-d415-411d-9325-6ea9b55d94c6" containerID="3fa4b04048f3bca3e31e1ebad9e295c20d660a5bde5bcf48a3b9a57b0b5266e7" exitCode=0 Apr 21 15:09:50.588822 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:50.588456 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" event={"ID":"624c4717-d415-411d-9325-6ea9b55d94c6","Type":"ContainerDied","Data":"3fa4b04048f3bca3e31e1ebad9e295c20d660a5bde5bcf48a3b9a57b0b5266e7"} Apr 21 15:09:50.589983 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:50.589959 2572 generic.go:358] "Generic (PLEG): container finished" podID="9555daba-3c59-412e-b7dc-2a4dd184adc6" containerID="c1da007d9d2a0b0deb85125ba26e1c2ac1c12b5f2e6779a3397be324e3b73c5c" exitCode=0 Apr 21 15:09:50.590047 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:50.590004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" event={"ID":"9555daba-3c59-412e-b7dc-2a4dd184adc6","Type":"ContainerDied","Data":"c1da007d9d2a0b0deb85125ba26e1c2ac1c12b5f2e6779a3397be324e3b73c5c"} Apr 21 15:09:52.599447 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:52.599410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" event={"ID":"624c4717-d415-411d-9325-6ea9b55d94c6","Type":"ContainerStarted","Data":"9ea8437afa76e9bf03f95ccf095c988ac1a0283b515cbfa8ebfd47f21916d4cc"} Apr 21 15:09:52.599870 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:52.599675 
2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:09:52.601257 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:52.601205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" event={"ID":"9555daba-3c59-412e-b7dc-2a4dd184adc6","Type":"ContainerStarted","Data":"4831751544d0cbffffbd1b03221b2e4b4becd2659193d8bd6ebc88aa51160569"} Apr 21 15:09:52.601409 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:52.601394 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:09:52.625764 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:52.625705 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" podStartSLOduration=10.485810398 podStartE2EDuration="11.625664403s" podCreationTimestamp="2026-04-21 15:09:41 +0000 UTC" firstStartedPulling="2026-04-21 15:09:50.589207994 +0000 UTC m=+833.443083433" lastFinishedPulling="2026-04-21 15:09:51.729061993 +0000 UTC m=+834.582937438" observedRunningTime="2026-04-21 15:09:52.622150217 +0000 UTC m=+835.476025680" watchObservedRunningTime="2026-04-21 15:09:52.625664403 +0000 UTC m=+835.479539865" Apr 21 15:09:52.645255 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:09:52.645196 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" podStartSLOduration=1.434600446 podStartE2EDuration="13.645181617s" podCreationTimestamp="2026-04-21 15:09:39 +0000 UTC" firstStartedPulling="2026-04-21 15:09:39.521886115 +0000 UTC m=+822.375761554" lastFinishedPulling="2026-04-21 15:09:51.732467285 +0000 UTC m=+834.586342725" observedRunningTime="2026-04-21 15:09:52.642886645 +0000 UTC m=+835.496762107" watchObservedRunningTime="2026-04-21 15:09:52.645181617 +0000 UTC m=+835.499057079" Apr 21 15:10:03.617626 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:03.617592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h" Apr 21 15:10:03.618592 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:03.618569 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-8stbg" Apr 21 15:10:20.784813 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.784778 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd"] Apr 21 15:10:20.788125 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.788104 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.791797 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.791778 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 21 15:10:20.805119 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.805091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd"] Apr 21 15:10:20.884617 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.884577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.884799 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.884632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.884799 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.884700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.884799 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.884740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpnp8\" (UniqueName: \"kubernetes.io/projected/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-kube-api-access-lpnp8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.884799 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.884764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.884983 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.884828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985386 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985330 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpnp8\" (UniqueName: \"kubernetes.io/projected/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-kube-api-access-lpnp8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985599 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985873 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985873 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.985873 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.985865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.987850 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.987826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.987983 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.987965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:20.996561 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:20.996541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpnp8\" (UniqueName: \"kubernetes.io/projected/b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12-kube-api-access-lpnp8\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd\" (UID: \"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:21.098310 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:21.098218 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:21.231562 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:21.231534 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd"] Apr 21 15:10:21.233103 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:10:21.233066 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0c9e2ab_cbb3_4c93_8ec7_2de15c79cf12.slice/crio-fe9ef08f792a89545d20e301152a0e8fc29e75d4683b16a14552dd2312073eda WatchSource:0}: Error finding container fe9ef08f792a89545d20e301152a0e8fc29e75d4683b16a14552dd2312073eda: Status 404 returned error can't find the container with id fe9ef08f792a89545d20e301152a0e8fc29e75d4683b16a14552dd2312073eda Apr 21 15:10:21.707975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:21.707935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" event={"ID":"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12","Type":"ContainerStarted","Data":"3d4d8dd1d8eb4fe5e2dcad70e3b25ef550cdbe603fdfee63ca420ccb0b7b34b1"} Apr 21 15:10:21.707975 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:21.707975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" event={"ID":"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12","Type":"ContainerStarted","Data":"fe9ef08f792a89545d20e301152a0e8fc29e75d4683b16a14552dd2312073eda"} Apr 21 15:10:26.726671 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:26.726640 2572 generic.go:358] "Generic (PLEG): container finished" podID="b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12" containerID="3d4d8dd1d8eb4fe5e2dcad70e3b25ef550cdbe603fdfee63ca420ccb0b7b34b1" exitCode=0 Apr 21 15:10:26.727092 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:26.726718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" event={"ID":"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12","Type":"ContainerDied","Data":"3d4d8dd1d8eb4fe5e2dcad70e3b25ef550cdbe603fdfee63ca420ccb0b7b34b1"} Apr 21 15:10:27.731492 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:27.731453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" event={"ID":"b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12","Type":"ContainerStarted","Data":"6c7d742a1285593db6bec2bc27398d12b2bae93231e825b291667136eeaaadc5"} Apr 21 15:10:27.731850 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:27.731692 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:27.761605 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:27.761549 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" podStartSLOduration=7.506374032 podStartE2EDuration="7.761533435s" podCreationTimestamp="2026-04-21 15:10:20 +0000 UTC" firstStartedPulling="2026-04-21 15:10:26.727373885 +0000 UTC m=+869.581249325" lastFinishedPulling="2026-04-21 15:10:26.982533285 +0000 UTC m=+869.836408728" observedRunningTime="2026-04-21 15:10:27.760726235 +0000 UTC m=+870.614601695" watchObservedRunningTime="2026-04-21 15:10:27.761533435 +0000 UTC m=+870.615408896" Apr 21 15:10:36.275302 ip-10-0-134-40 kubenswrapper[2572]: I0421 
15:10:36.275267 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx"] Apr 21 15:10:36.279963 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.279941 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.284460 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.284438 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 15:10:36.302621 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.302599 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx"] Apr 21 15:10:36.408308 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.408270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.408474 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.408324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/418c65ae-cb2f-40d5-b859-a81503d6ed20-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.408474 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.408352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27m2l\" (UniqueName: \"kubernetes.io/projected/418c65ae-cb2f-40d5-b859-a81503d6ed20-kube-api-access-27m2l\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.408474 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.408393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.408597 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.408472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.408597 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.408508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509508 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509470 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509508 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509740 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509740 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/418c65ae-cb2f-40d5-b859-a81503d6ed20-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509740 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27m2l\" (UniqueName: \"kubernetes.io/projected/418c65ae-cb2f-40d5-b859-a81503d6ed20-kube-api-access-27m2l\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509891 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509748 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.509997 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.509972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.510083 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.510047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.510197 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.510173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.511691 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.511673 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/418c65ae-cb2f-40d5-b859-a81503d6ed20-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.512069 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.512049 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/418c65ae-cb2f-40d5-b859-a81503d6ed20-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.527616 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.527563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27m2l\" (UniqueName: \"kubernetes.io/projected/418c65ae-cb2f-40d5-b859-a81503d6ed20-kube-api-access-27m2l\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx\" (UID: \"418c65ae-cb2f-40d5-b859-a81503d6ed20\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.590106 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.590063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:36.723996 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:10:36.723967 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418c65ae_cb2f_40d5_b859_a81503d6ed20.slice/crio-42605a0338f0a244a77550b444294344b2c5c7bd77d0f6904571f55a610ace9a WatchSource:0}: Error finding container 42605a0338f0a244a77550b444294344b2c5c7bd77d0f6904571f55a610ace9a: Status 404 returned error can't find the container with id 42605a0338f0a244a77550b444294344b2c5c7bd77d0f6904571f55a610ace9a Apr 21 15:10:36.724949 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.724926 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx"] Apr 21 15:10:36.765875 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:36.765845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" event={"ID":"418c65ae-cb2f-40d5-b859-a81503d6ed20","Type":"ContainerStarted","Data":"42605a0338f0a244a77550b444294344b2c5c7bd77d0f6904571f55a610ace9a"} Apr 21 15:10:37.772406 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:37.772371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" event={"ID":"418c65ae-cb2f-40d5-b859-a81503d6ed20","Type":"ContainerStarted","Data":"6dc5147e4b4f4121bdd0b389f8a40acc9d399d24a06327240e04e4be1347c23a"} Apr 21 15:10:38.748624 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:38.748592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd" Apr 21 15:10:42.791559 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:42.791526 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="418c65ae-cb2f-40d5-b859-a81503d6ed20" containerID="6dc5147e4b4f4121bdd0b389f8a40acc9d399d24a06327240e04e4be1347c23a" exitCode=0 Apr 21 15:10:42.791967 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:42.791605 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" event={"ID":"418c65ae-cb2f-40d5-b859-a81503d6ed20","Type":"ContainerDied","Data":"6dc5147e4b4f4121bdd0b389f8a40acc9d399d24a06327240e04e4be1347c23a"} Apr 21 15:10:43.797473 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:43.797438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" event={"ID":"418c65ae-cb2f-40d5-b859-a81503d6ed20","Type":"ContainerStarted","Data":"191a758c370d55d807a2348314a3de1e70770ca69681afa734b9c53543e6743e"} Apr 21 15:10:43.797921 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:43.797667 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:43.836194 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:43.836141 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" podStartSLOduration=7.512853686 podStartE2EDuration="7.836124656s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:42.792219291 +0000 UTC m=+885.646094731" lastFinishedPulling="2026-04-21 15:10:43.115490259 +0000 UTC m=+885.969365701" observedRunningTime="2026-04-21 15:10:43.835224741 +0000 UTC m=+886.689100201" watchObservedRunningTime="2026-04-21 15:10:43.836124656 +0000 UTC m=+886.690000116" Apr 21 15:10:54.814981 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:54.814879 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx" Apr 21 15:10:57.658480 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:57.658449 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:10:57.659113 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:10:57.659090 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:15:57.692227 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:15:57.692194 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:15:57.693564 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:15:57.693543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:20:57.728543 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:20:57.728511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:20:57.734606 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:20:57.734580 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:25:57.762181 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:25:57.762153 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:25:57.765678 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:25:57.765649 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:30:57.804105 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:30:57.804072 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:30:57.807667 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:30:57.807638 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:35:57.839285 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:35:57.839252 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:35:57.842498 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:35:57.842458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rgqmx_db6c90a6-c365-45f5-bad7-00c882e79192/ovn-acl-logging/0.log" Apr 21 15:38:18.194451 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:18.194413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-8989f94bb-22sgb_4a065cd6-7dd1-4bb4-8392-336bfae3d838/maas-api/0.log" Apr 21 15:38:18.777742 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:18.777709 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cfc874c8f-nrl6h_007c6968-5570-4f49-817c-1fc4331bf1f3/manager/0.log" Apr 21 15:38:19.619132 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.619100 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm_4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15/extract/0.log" Apr 21 15:38:19.625354 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.625304 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm_4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15/util/0.log" Apr 21 15:38:19.631716 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.631688 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm_4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15/pull/0.log" Apr 21 15:38:19.749199 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.749171 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn_c1291a95-eb90-40f2-b6be-eff3e3d9b3ef/util/0.log" Apr 21 15:38:19.758747 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.758723 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn_c1291a95-eb90-40f2-b6be-eff3e3d9b3ef/pull/0.log" Apr 21 15:38:19.764891 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.764862 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn_c1291a95-eb90-40f2-b6be-eff3e3d9b3ef/extract/0.log" Apr 21 15:38:19.875857 
ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.875772 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm_b8bf2bb0-cff5-47e5-87d7-ac222bcf6281/util/0.log"
Apr 21 15:38:19.882715 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.882691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm_b8bf2bb0-cff5-47e5-87d7-ac222bcf6281/pull/0.log"
Apr 21 15:38:19.889635 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.889611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm_b8bf2bb0-cff5-47e5-87d7-ac222bcf6281/extract/0.log"
Apr 21 15:38:19.995791 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:19.995762 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4_a1298291-6cb9-4bc9-86f1-429f59568a03/util/0.log"
Apr 21 15:38:20.002770 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.002749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4_a1298291-6cb9-4bc9-86f1-429f59568a03/pull/0.log"
Apr 21 15:38:20.009730 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.009708 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4_a1298291-6cb9-4bc9-86f1-429f59568a03/extract/0.log"
Apr 21 15:38:20.240578 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.240543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-9q69c_b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6/manager/0.log"
Apr 21 15:38:20.346600 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.346572 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-dbc74_e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7/manager/0.log"
Apr 21 15:38:20.458037 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.457996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-9nhbn_fc9b2af3-d5a7-4f3d-829c-842f23393991/kuadrant-console-plugin/0.log"
Apr 21 15:38:20.576843 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.576752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-tkdp8_3937c90b-44ff-41e7-8184-abae38a17533/registry-server/0.log"
Apr 21 15:38:20.914591 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:20.914503 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-8pph7_1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae/manager/0.log"
Apr 21 15:38:21.249546 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:21.249522 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fnz4fj_6838d6f4-b876-4811-bde1-5917c457710a/istio-proxy/0.log"
Apr 21 15:38:22.285336 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.285303 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx_418c65ae-cb2f-40d5-b859-a81503d6ed20/main/0.log"
Apr 21 15:38:22.291354 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.291324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-gmslx_418c65ae-cb2f-40d5-b859-a81503d6ed20/storage-initializer/0.log"
Apr 21 15:38:22.405284 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.405248 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-8stbg_9555daba-3c59-412e-b7dc-2a4dd184adc6/main/0.log"
Apr 21 15:38:22.411557 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.411535 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-8stbg_9555daba-3c59-412e-b7dc-2a4dd184adc6/storage-initializer/0.log"
Apr 21 15:38:22.521331 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.521291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd_b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12/storage-initializer/0.log"
Apr 21 15:38:22.528407 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.528385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccdbphd_b0c9e2ab-cbb3-4c93-8ec7-2de15c79cf12/main/0.log"
Apr 21 15:38:22.640187 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.640113 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h_624c4717-d415-411d-9325-6ea9b55d94c6/storage-initializer/0.log"
Apr 21 15:38:22.647053 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:22.647028 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-zr48h_624c4717-d415-411d-9325-6ea9b55d94c6/main/0.log"
Apr 21 15:38:29.832425 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:29.832377 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vvmhw_d8fc930a-a19b-433f-8c27-8eb6887b0e8e/global-pull-secret-syncer/0.log"
Apr 21 15:38:29.945573 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:29.945536 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zfc5h_b27340dd-c76e-48e4-a58b-6826530d3e1d/konnectivity-agent/0.log"
Apr 21 15:38:30.018426 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:30.018396 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-40.ec2.internal_9d94cd17bd83c799f493062012f2d96c/haproxy/0.log"
Apr 21 15:38:34.039353 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.039319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm_4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15/extract/0.log"
Apr 21 15:38:34.066508 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.066480 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm_4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15/util/0.log"
Apr 21 15:38:34.090625 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.090594 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tvjdm_4862ae9d-8ac4-44ef-b3c2-cb0e986e2e15/pull/0.log"
Apr 21 15:38:34.120129 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.120102 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn_c1291a95-eb90-40f2-b6be-eff3e3d9b3ef/extract/0.log"
Apr 21 15:38:34.151786 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.151755 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn_c1291a95-eb90-40f2-b6be-eff3e3d9b3ef/util/0.log"
Apr 21 15:38:34.178419 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.178387 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0svgcn_c1291a95-eb90-40f2-b6be-eff3e3d9b3ef/pull/0.log"
Apr 21 15:38:34.206159 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.206127 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm_b8bf2bb0-cff5-47e5-87d7-ac222bcf6281/extract/0.log"
Apr 21 15:38:34.231138 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.231105 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm_b8bf2bb0-cff5-47e5-87d7-ac222bcf6281/util/0.log"
Apr 21 15:38:34.257690 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.257662 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xgntm_b8bf2bb0-cff5-47e5-87d7-ac222bcf6281/pull/0.log"
Apr 21 15:38:34.287041 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.287009 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4_a1298291-6cb9-4bc9-86f1-429f59568a03/extract/0.log"
Apr 21 15:38:34.331569 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.331486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4_a1298291-6cb9-4bc9-86f1-429f59568a03/util/0.log"
Apr 21 15:38:34.361283 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.361256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1grnx4_a1298291-6cb9-4bc9-86f1-429f59568a03/pull/0.log"
Apr 21 15:38:34.706717 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.706686 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-9q69c_b2c0319d-728e-4fe5-ac74-4bb6f0c1f9c6/manager/0.log"
Apr 21 15:38:34.731112 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.731083 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-dbc74_e147e374-9b6a-4c0c-8db9-fbe6ea3f1ea7/manager/0.log"
Apr 21 15:38:34.756692 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.756661 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-9nhbn_fc9b2af3-d5a7-4f3d-829c-842f23393991/kuadrant-console-plugin/0.log"
Apr 21 15:38:34.796429 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.796397 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-tkdp8_3937c90b-44ff-41e7-8184-abae38a17533/registry-server/0.log"
Apr 21 15:38:34.966805 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:34.966718 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-8pph7_1a5a9844-0a16-41c9-9ff1-9c2a8bc5ebae/manager/0.log"
Apr 21 15:38:36.776121 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:36.776094 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wvgjh_c4a393e9-c391-463f-ae8b-618b766b8ca3/node-exporter/0.log"
Apr 21 15:38:36.798094 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:36.798066 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wvgjh_c4a393e9-c391-463f-ae8b-618b766b8ca3/kube-rbac-proxy/0.log"
Apr 21 15:38:36.822752 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:36.822724 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wvgjh_c4a393e9-c391-463f-ae8b-618b766b8ca3/init-textfile/0.log"
Apr 21 15:38:38.360588 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.360554 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"]
Apr 21 15:38:38.363959 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.363935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.367267 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.367240 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b6hvv\"/\"openshift-service-ca.crt\""
Apr 21 15:38:38.367438 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.367354 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b6hvv\"/\"kube-root-ca.crt\""
Apr 21 15:38:38.367515 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.367458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b6hvv\"/\"default-dockercfg-kmhbt\""
Apr 21 15:38:38.376489 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.376459 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"]
Apr 21 15:38:38.462610 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.462572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-sys\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.462610 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.462611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-lib-modules\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.462832 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.462633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-proc\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.462832 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.462676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx575\" (UniqueName: \"kubernetes.io/projected/5d215ff4-a28a-409c-8429-623b4c4cac6d-kube-api-access-cx575\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.462832 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.462741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-podres\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563157 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-podres\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-sys\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-lib-modules\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-proc\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx575\" (UniqueName: \"kubernetes.io/projected/5d215ff4-a28a-409c-8429-623b4c4cac6d-kube-api-access-cx575\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-sys\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-podres\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-proc\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.563355 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.563347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d215ff4-a28a-409c-8429-623b4c4cac6d-lib-modules\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.573024 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.572988 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx575\" (UniqueName: \"kubernetes.io/projected/5d215ff4-a28a-409c-8429-623b4c4cac6d-kube-api-access-cx575\") pod \"perf-node-gather-daemonset-gqffd\" (UID: \"5d215ff4-a28a-409c-8429-623b4c4cac6d\") " pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.674980 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.674870 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:38.807586 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.807553 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"]
Apr 21 15:38:38.809528 ip-10-0-134-40 kubenswrapper[2572]: W0421 15:38:38.809500 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5d215ff4_a28a_409c_8429_623b4c4cac6d.slice/crio-8e2e6f92d78bc68d03d94d1c4d07b0fbe0c8b663573979e0a971c02c457417cd WatchSource:0}: Error finding container 8e2e6f92d78bc68d03d94d1c4d07b0fbe0c8b663573979e0a971c02c457417cd: Status 404 returned error can't find the container with id 8e2e6f92d78bc68d03d94d1c4d07b0fbe0c8b663573979e0a971c02c457417cd
Apr 21 15:38:38.811480 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:38.811463 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:38:39.183696 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:39.183658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd" event={"ID":"5d215ff4-a28a-409c-8429-623b4c4cac6d","Type":"ContainerStarted","Data":"b538f056c320c4077f2e7e151cc3d89638924b1b8ff979e33cc2d492dce5f404"}
Apr 21 15:38:39.183696 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:39.183698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd" event={"ID":"5d215ff4-a28a-409c-8429-623b4c4cac6d","Type":"ContainerStarted","Data":"8e2e6f92d78bc68d03d94d1c4d07b0fbe0c8b663573979e0a971c02c457417cd"}
Apr 21 15:38:39.183942 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:39.183810 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:39.204159 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:39.204105 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd" podStartSLOduration=1.204088144 podStartE2EDuration="1.204088144s" podCreationTimestamp="2026-04-21 15:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:38:39.202961779 +0000 UTC m=+2562.056837243" watchObservedRunningTime="2026-04-21 15:38:39.204088144 +0000 UTC m=+2562.057963607"
Apr 21 15:38:40.942349 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:40.942320 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v55rn_f3617c41-91e1-4dea-bc4c-4a975db40cbd/dns/0.log"
Apr 21 15:38:40.964102 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:40.964074 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v55rn_f3617c41-91e1-4dea-bc4c-4a975db40cbd/kube-rbac-proxy/0.log"
Apr 21 15:38:41.035777 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:41.035745 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-52rdt_31f48be8-c9fa-4f17-9944-60b8aaace332/dns-node-resolver/0.log"
Apr 21 15:38:41.589686 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:41.589649 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-677bfb6859-72cm4_7f4f8693-25c3-43cb-be49-ff52f766df6f/registry/0.log"
Apr 21 15:38:41.643858 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:41.643826 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-csr2d_bfd7a190-efd2-4b62-9acb-5f68c16053f5/node-ca/0.log"
Apr 21 15:38:42.501123 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:42.501092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fnz4fj_6838d6f4-b876-4811-bde1-5917c457710a/istio-proxy/0.log"
Apr 21 15:38:43.374064 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:43.374037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vr92l_35ffa47c-97fe-49fe-a050-659e851233d4/serve-healthcheck-canary/0.log"
Apr 21 15:38:43.937775 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:43.937749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2jbhm_50db6618-1e4c-443b-bb09-94f4961f7983/kube-rbac-proxy/0.log"
Apr 21 15:38:43.976008 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:43.975977 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2jbhm_50db6618-1e4c-443b-bb09-94f4961f7983/exporter/0.log"
Apr 21 15:38:44.052513 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:44.052474 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2jbhm_50db6618-1e4c-443b-bb09-94f4961f7983/extractor/0.log"
Apr 21 15:38:45.199994 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:45.199970 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b6hvv/perf-node-gather-daemonset-gqffd"
Apr 21 15:38:46.932315 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:46.932278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-8989f94bb-22sgb_4a065cd6-7dd1-4bb4-8392-336bfae3d838/maas-api/0.log"
Apr 21 15:38:47.388279 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:47.388197 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cfc874c8f-nrl6h_007c6968-5570-4f49-817c-1fc4331bf1f3/manager/0.log"
Apr 21 15:38:49.351546 ip-10-0-134-40 kubenswrapper[2572]: I0421 15:38:49.351513 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-586c4cccd6-vlql9_05ac9066-ab4c-4d3b-9a2c-ba6e873e0793/manager/0.log"