Apr 22 13:18:43.494306 ip-10-0-142-133 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 13:18:43.494317 ip-10-0-142-133 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 13:18:43.494323 ip-10-0-142-133 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 13:18:43.494542 ip-10-0-142-133 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 13:18:54.820868 ip-10-0-142-133 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 13:18:54.820887 ip-10-0-142-133 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 57a69bafc9b74a81a301cb1630cb1a34 --
Apr 22 13:21:23.220527 ip-10-0-142-133 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 13:21:23.684273 ip-10-0-142-133 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 13:21:23.684273 ip-10-0-142-133 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 13:21:23.684273 ip-10-0-142-133 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 13:21:23.684273 ip-10-0-142-133 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 13:21:23.684273 ip-10-0-142-133 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 13:21:23.687698 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.687605 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 13:21:23.691020 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.690995 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:21:23.691020 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691016 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:21:23.691020 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691019 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 13:21:23.691020 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691024 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:21:23.691020 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691028 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691032 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691035 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691038 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691041 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691044 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691046 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691049 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691052 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691054 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691057 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691059 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691062 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691065 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691068 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691072 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691076 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691079 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691082 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 13:21:23.691245 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691085 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691088 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691090 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691093 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691096 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691106 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691109 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691111 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691114 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691117 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691119 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691121 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691124 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691126 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691129 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691133 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691136 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691139 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691142 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 13:21:23.691702 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691144 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691147 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691150 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691152 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691155 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691157 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691160 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691176 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691179 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691182 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691185 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691187 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691190 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691193 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691196 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691199 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691202 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691204 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691207 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691209 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 13:21:23.692160 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691212 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691215 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691218 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691221 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691224 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691226 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691229 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691232 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691235 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691238 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691241 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691243 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691246 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691249 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691252 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691255 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691264 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691268 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691272 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691276 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:21:23.692744 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691280 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691284 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691287 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691290 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691705 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691711 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691714 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691717 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691719 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691722 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691724 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691727 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691730 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691733 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691735 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691738 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691741 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691744 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691747 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691750 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 13:21:23.693255 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691752 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691755 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691758 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691760 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691763 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691767 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691771 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691774 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691783 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691786 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691788 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691791 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691794 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691797 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691799 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691802 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691804 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691807 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691810 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 13:21:23.693729 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691813 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691815 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691817 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691820 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691823 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691825 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691828 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691831 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691834 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691837 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691840 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691843 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691846 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691848 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691851 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691854 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691856 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691859 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691861 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:21:23.694232 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691863 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691866 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691868 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691878 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691881 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691884 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691886 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691889 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691891 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691894 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691896 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691900 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691902 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691904 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691907 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691909 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691912 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691914 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691917 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691920 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 13:21:23.694710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691923 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691925 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691929 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691931 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691934 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691937 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691939 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691941 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691944 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691947 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691949 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.691952 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693473 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693484 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693494 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693499 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693511 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693514 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693519 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693526 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693529 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693532 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 13:21:23.695218 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693536 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693539 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693543 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693546 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693549 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693552 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693555 2567 flags.go:64] FLAG: --cloud-config=""
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693558 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693561 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693568 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693571 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693574 2567 flags.go:64] FLAG: --config-dir=""
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693577 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693581 2567 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693585 2567 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693587 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693591 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693594 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693598 2567 flags.go:64] FLAG: --contention-profiling="false" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693601 2567 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693604 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693607 2567 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693610 2567 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693615 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693618 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 13:21:23.695761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693621 2567 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693624 2567 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 13:21:23.696375 
ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693633 2567 flags.go:64] FLAG: --enable-server="true" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693636 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693643 2567 flags.go:64] FLAG: --event-burst="100" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693646 2567 flags.go:64] FLAG: --event-qps="50" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693649 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693652 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693655 2567 flags.go:64] FLAG: --eviction-hard="" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693659 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693662 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693665 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693668 2567 flags.go:64] FLAG: --eviction-soft="" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693671 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693674 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693677 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693680 2567 
flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693683 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693686 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693690 2567 flags.go:64] FLAG: --feature-gates="" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693694 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693697 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693700 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693704 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693707 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 22 13:21:23.696375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693710 2567 flags.go:64] FLAG: --help="false" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693713 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-142-133.ec2.internal" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693716 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693719 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693722 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693725 2567 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693728 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693732 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693735 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693738 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693746 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693750 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693753 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693756 2567 flags.go:64] FLAG: --kube-reserved="" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693759 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693762 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693765 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693768 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693771 2567 flags.go:64] FLAG: --lock-file="" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693774 2567 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693777 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693780 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693785 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 13:21:23.696999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693788 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693791 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693794 2567 flags.go:64] FLAG: --logging-format="text" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693798 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693801 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693804 2567 flags.go:64] FLAG: --manifest-url="" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693807 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693812 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693815 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693819 2567 flags.go:64] FLAG: --max-pods="110" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693822 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: 
I0422 13:21:23.693825 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693828 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693831 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693834 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693837 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693840 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693848 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693852 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693855 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693864 2567 flags.go:64] FLAG: --pod-cidr="" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693867 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693873 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693876 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 13:21:23.697558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693879 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 22 
13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693882 2567 flags.go:64] FLAG: --port="10250" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693885 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693888 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0adaf6df9045c8939" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693892 2567 flags.go:64] FLAG: --qos-reserved="" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693895 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693898 2567 flags.go:64] FLAG: --register-node="true" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693901 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693904 2567 flags.go:64] FLAG: --register-with-taints="" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693908 2567 flags.go:64] FLAG: --registry-burst="10" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693911 2567 flags.go:64] FLAG: --registry-qps="5" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693914 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693918 2567 flags.go:64] FLAG: --reserved-memory="" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693922 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693925 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693928 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 
13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693930 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693933 2567 flags.go:64] FLAG: --runonce="false" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693936 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693939 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693942 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693945 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693947 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693950 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693954 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693957 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 13:21:23.698194 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693963 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693966 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693969 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693977 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 13:21:23.698857 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:23.693981 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693984 2567 flags.go:64] FLAG: --system-cgroups="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693986 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.693997 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694000 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694003 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694011 2567 flags.go:64] FLAG: --tls-min-version="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694014 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694017 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694020 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694023 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694026 2567 flags.go:64] FLAG: --v="2" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694031 2567 flags.go:64] FLAG: --version="false" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694035 2567 flags.go:64] FLAG: --vmodule="" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694040 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 
13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694043 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694149 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694153 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694156 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694159 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 13:21:23.698857 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694174 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694177 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694181 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694185 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694188 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694191 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694194 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694197 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694199 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694202 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694205 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694207 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694211 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694219 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694222 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694225 2567 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694227 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694230 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694233 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694236 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 13:21:23.699496 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694239 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694242 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694245 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694248 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694251 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694253 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694256 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694259 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 
13:21:23.694261 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694264 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694266 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694269 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694271 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694274 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694277 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694279 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694281 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694284 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694286 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694289 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 13:21:23.700016 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694291 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 13:21:23.700536 
ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694294 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694297 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694299 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694302 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694306 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694309 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694311 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694314 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694316 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694319 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694321 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694324 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694326 2567 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694329 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694332 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694334 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694337 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694339 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694342 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 13:21:23.700536 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694344 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694347 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694349 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694352 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694355 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694357 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 
13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694361 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694364 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694367 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694370 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694373 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694375 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694378 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694380 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694383 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694385 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694388 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694391 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694396 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:21:23.701024 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694398 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:21:23.701506 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694401 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:21:23.701506 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.694403 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:21:23.701506 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.694415 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 13:21:23.702849 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.702826 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 13:21:23.702884 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.702850 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 13:21:23.702918 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702901 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 13:21:23.702918 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702906 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 13:21:23.702918 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702910 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 13:21:23.702918 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702914 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 13:21:23.702918 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702919 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702922 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702925 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702928 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702931 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702934 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702936 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702939 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702941 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702944 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702946 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702949 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702951 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702954 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702957 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702960 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702962 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702965 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702967 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702970 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 13:21:23.703046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702973 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702976 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702979 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702981 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702984 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702987 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702990 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.702998 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703001 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703004 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703006 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703009 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703011 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703014 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703017 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703019 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703022 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703024 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703027 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703030 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 13:21:23.703548 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703032 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703035 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703037 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703040 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703043 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703045 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703048 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703051 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703054 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703057 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703060 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703063 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703065 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703068 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703070 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703073 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703076 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703078 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703082 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703084 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:21:23.704072 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703093 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703096 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703099 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703102 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703104 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703107 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703109 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703112 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703115 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703117 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703120 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703123 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703127 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703130 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703133 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703135 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703138 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703141 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703143 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:21:23.704580 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703146 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703148 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703151 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.703157 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703274 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703280 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703285 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703289 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703292 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703295 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703298 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703301 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703304 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703307 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703311 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 13:21:23.705050 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703314 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703317 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703320 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703323 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703326 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703329 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703332 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703335 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703338 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703340 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703343 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703345 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703348 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703351 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703353 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703356 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703358 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703361 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703364 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703366 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 13:21:23.705445 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703369 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703371 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703373 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703376 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703378 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703381 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703383 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703386 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703388 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703391 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703393 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703396 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703400 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703403 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703405 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703408 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703411 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703413 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703415 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703419 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 13:21:23.705922 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703421 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703424 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703426 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703428 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703431 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703433 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703436 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703438 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703441 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703443 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703446 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703449 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703451 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703454 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703456 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703459 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703461 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703464 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703466 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703469 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 13:21:23.706512 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703472 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703474 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703477 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703479 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703482 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703485 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703488 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703491 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703493 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703495 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703498 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703501 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703503 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703506 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:23.703508 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:21:23.707029 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.703513 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 13:21:23.707417 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.704251 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 13:21:23.708094 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.708080 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 13:21:23.709224 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.709213 2567 server.go:1019] "Starting client certificate rotation"
Apr 22 13:21:23.709326 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.709310 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 13:21:23.709356 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.709346 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 13:21:23.737651 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.737630 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 13:21:23.741786 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.741768 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 13:21:23.757240 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.757217 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 22 13:21:23.762899 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.762878 2567 log.go:25] "Validated CRI v1 image API"
Apr 22 13:21:23.764524 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.764507 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 13:21:23.765404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.765385 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 13:21:23.767563 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.767544 2567 fs.go:135] Filesystem UUIDs: map[5dd91a5c-e25b-49e3-9364-37541dbe1272:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b5e2ec60-322b-46e6-8de2-304053dfbb41:/dev/nvme0n1p4]
Apr 22 13:21:23.767636 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.767563 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 13:21:23.773220 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773102 2567 manager.go:217] Machine: {Timestamp:2026-04-22 13:21:23.771800838 +0000 UTC m=+0.424904613 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3083755 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bdb3ba332d3f354128965a8557f7a SystemUUID:ec2bdb3b-a332-d3f3-5412-8965a8557f7a BootID:57a69baf-c9b7-4a81-a301-cb1630cb1a34 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9d:f8:8e:d5:53 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9d:f8:8e:d5:53 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:78:6e:af:26:ad Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 13:21:23.773220 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773215 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 13:21:23.773315 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773299 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 13:21:23.773701 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773676 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 13:21:23.773842 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773702 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-133.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 13:21:23.773884 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773852 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 13:21:23.773884 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773861 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 13:21:23.773884 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773878 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 13:21:23.773966 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.773894 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 13:21:23.775198 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.775188 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 22 13:21:23.775313 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.775304 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 13:21:23.777673 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.777663 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 22 13:21:23.777707 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.777678 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 13:21:23.778373 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.778364 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 13:21:23.778413 ip-10-0-142-133 kubenswrapper[2567]: I0422 
13:21:23.778376 2567 kubelet.go:397] "Adding apiserver pod source" Apr 22 13:21:23.778413 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.778385 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 13:21:23.779629 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.779613 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 13:21:23.779629 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.779631 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 13:21:23.782964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.782940 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 13:21:23.784061 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.784040 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v797k" Apr 22 13:21:23.784210 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.784196 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 13:21:23.785858 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785847 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785864 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785870 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785876 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 13:21:23.785912 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:23.785882 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785887 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785892 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785898 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785906 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 13:21:23.785912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785912 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 13:21:23.786141 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785920 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 13:21:23.786141 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.785929 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 13:21:23.786927 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.786918 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 13:21:23.786927 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.786926 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 13:21:23.790598 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.790583 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 13:21:23.790667 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.790620 2567 server.go:1295] "Started kubelet" Apr 22 13:21:23.790714 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.790686 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 
13:21:23.790825 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.790784 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 13:21:23.790879 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.790849 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 13:21:23.791574 ip-10-0-142-133 systemd[1]: Started Kubernetes Kubelet. Apr 22 13:21:23.794391 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.794363 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v797k" Apr 22 13:21:23.794487 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.794393 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 13:21:23.796459 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.796433 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 13:21:23.796551 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.796493 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-133.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 13:21:23.796613 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.796593 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 22 13:21:23.796654 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.796607 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 13:21:23.800200 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.800160 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 13:21:23.800705 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.800689 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 13:21:23.801346 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.801324 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 13:21:23.801423 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.801381 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 13:21:23.801423 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.801398 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 13:21:23.801541 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.801519 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 22 13:21:23.801541 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.801534 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 22 13:21:23.802055 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.802037 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 13:21:23.802237 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.802228 2567 factory.go:55] Registering systemd factory Apr 22 13:21:23.802849 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.802828 2567 factory.go:223] Registration of the systemd container factory successfully Apr 22 13:21:23.802923 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.802359 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not 
found" Apr 22 13:21:23.803704 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.803314 2567 factory.go:153] Registering CRI-O factory Apr 22 13:21:23.803704 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.803329 2567 factory.go:223] Registration of the crio container factory successfully Apr 22 13:21:23.803704 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.803347 2567 factory.go:103] Registering Raw factory Apr 22 13:21:23.803704 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.803364 2567 manager.go:1196] Started watching for new ooms in manager Apr 22 13:21:23.804078 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.804063 2567 manager.go:319] Starting recovery of all containers Apr 22 13:21:23.804749 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.804730 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 13:21:23.809323 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.809299 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 13:21:23.814073 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.813921 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-133.ec2.internal\" not found" node="ip-10-0-142-133.ec2.internal" Apr 22 13:21:23.814142 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.813995 2567 manager.go:324] Recovery completed Apr 22 13:21:23.818195 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.818181 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 13:21:23.820453 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.820438 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientMemory" Apr 22 13:21:23.820517 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:23.820463 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 13:21:23.820517 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.820475 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientPID" Apr 22 13:21:23.820961 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.820940 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 13:21:23.820961 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.820957 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 13:21:23.821053 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.820973 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 22 13:21:23.823976 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.823965 2567 policy_none.go:49] "None policy: Start" Apr 22 13:21:23.824027 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.823980 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 13:21:23.824027 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.823989 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 22 13:21:23.864834 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.864809 2567 manager.go:341] "Starting Device Plugin manager" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.864845 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.864855 2567 server.go:85] "Starting device plugin registration server" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.865086 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.865099 2567 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.865290 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.865372 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.865380 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.865966 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 13:21:23.889001 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.866013 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-133.ec2.internal\" not found" Apr 22 13:21:23.941722 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.941647 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 13:21:23.942861 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.942846 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 13:21:23.942927 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.942874 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 13:21:23.942927 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.942893 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 13:21:23.942927 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.942900 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 13:21:23.943056 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.942935 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 13:21:23.946238 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.946218 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:21:23.965938 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.965917 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:21:23.966818 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.966794 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:21:23.966921 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.966826 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:21:23.966921 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.966837 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:21:23.966921 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.966860 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-133.ec2.internal"
Apr 22 13:21:23.975223 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:23.975209 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-133.ec2.internal"
Apr 22 13:21:23.975268 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.975231 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-133.ec2.internal\": node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:23.988368 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:23.988349 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.043433 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.043384 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"]
Apr 22 13:21:24.043538 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.043490 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:21:24.044880 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.044860 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:21:24.044968 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.044894 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:21:24.044968 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.044911 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:21:24.047296 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.047283 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:21:24.047442 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.047426 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.047495 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.047464 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:21:24.048103 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.048087 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:21:24.048197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.048114 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:21:24.048197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.048125 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:21:24.048197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.048093 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:21:24.048197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.048184 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:21:24.048197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.048196 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:21:24.050357 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.050342 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.050442 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.050371 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:21:24.051067 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.051051 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:21:24.051137 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.051081 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:21:24.051137 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.051092 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:21:24.064766 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.064744 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-133.ec2.internal\" not found" node="ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.068748 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.068730 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-133.ec2.internal\" not found" node="ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.088965 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.088938 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.103451 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.103428 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/073188d8ef5c4ac245f88f75f2b768bb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal\" (UID: \"073188d8ef5c4ac245f88f75f2b768bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.103541 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.103452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/073188d8ef5c4ac245f88f75f2b768bb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal\" (UID: \"073188d8ef5c4ac245f88f75f2b768bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.103541 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.103475 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e2e036ce9e751626ebdb05be6b1bff59-config\") pod \"kube-apiserver-proxy-ip-10-0-142-133.ec2.internal\" (UID: \"e2e036ce9e751626ebdb05be6b1bff59\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.189434 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.189391 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.203829 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.203769 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/073188d8ef5c4ac245f88f75f2b768bb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal\" (UID: \"073188d8ef5c4ac245f88f75f2b768bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.203829 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.203800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/073188d8ef5c4ac245f88f75f2b768bb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal\" (UID: \"073188d8ef5c4ac245f88f75f2b768bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.203829 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.203820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e2e036ce9e751626ebdb05be6b1bff59-config\") pod \"kube-apiserver-proxy-ip-10-0-142-133.ec2.internal\" (UID: \"e2e036ce9e751626ebdb05be6b1bff59\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.204007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.203857 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e2e036ce9e751626ebdb05be6b1bff59-config\") pod \"kube-apiserver-proxy-ip-10-0-142-133.ec2.internal\" (UID: \"e2e036ce9e751626ebdb05be6b1bff59\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.204007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.203864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/073188d8ef5c4ac245f88f75f2b768bb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal\" (UID: \"073188d8ef5c4ac245f88f75f2b768bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.204007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.203871 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/073188d8ef5c4ac245f88f75f2b768bb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal\" (UID: \"073188d8ef5c4ac245f88f75f2b768bb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.290198 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.290148 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.366712 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.366688 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.371087 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.371062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:24.390553 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.390515 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.491130 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.491036 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.591630 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.591599 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.692075 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.692045 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.708565 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.708549 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 13:21:24.708686 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.708669 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 13:21:24.708725 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.708707 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 13:21:24.792512 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.792479 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.796677 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.796634 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 13:16:23 +0000 UTC" deadline="2027-12-21 03:48:35.792289687 +0000 UTC"
Apr 22 13:21:24.796791 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.796678 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14582h27m10.995616494s"
Apr 22 13:21:24.800312 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.800291 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 13:21:24.814503 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.814481 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 13:21:24.833365 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.833337 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cjk9m"
Apr 22 13:21:24.841160 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.841139 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cjk9m"
Apr 22 13:21:24.893394 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:24.893370 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-133.ec2.internal\" not found"
Apr 22 13:21:24.901034 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:24.901001 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e036ce9e751626ebdb05be6b1bff59.slice/crio-c89647c1de96e138279c495a74715cd0eccb0bf0cd246b183611559d66cb1909 WatchSource:0}: Error finding container c89647c1de96e138279c495a74715cd0eccb0bf0cd246b183611559d66cb1909: Status 404 returned error can't find the container with id c89647c1de96e138279c495a74715cd0eccb0bf0cd246b183611559d66cb1909
Apr 22 13:21:24.901459 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:24.901436 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073188d8ef5c4ac245f88f75f2b768bb.slice/crio-867cdd14a9193cd2798cdc112112b6c4ab57db7975fb83bbcf9b52dd0dba7531 WatchSource:0}: Error finding container 867cdd14a9193cd2798cdc112112b6c4ab57db7975fb83bbcf9b52dd0dba7531: Status 404 returned error can't find the container with id 867cdd14a9193cd2798cdc112112b6c4ab57db7975fb83bbcf9b52dd0dba7531
Apr 22 13:21:24.905289 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.905274 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 13:21:24.946049 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.946003 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal" event={"ID":"073188d8ef5c4ac245f88f75f2b768bb","Type":"ContainerStarted","Data":"867cdd14a9193cd2798cdc112112b6c4ab57db7975fb83bbcf9b52dd0dba7531"}
Apr 22 13:21:24.946948 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.946919 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal" event={"ID":"e2e036ce9e751626ebdb05be6b1bff59","Type":"ContainerStarted","Data":"c89647c1de96e138279c495a74715cd0eccb0bf0cd246b183611559d66cb1909"}
Apr 22 13:21:24.962320 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:24.962301 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:21:25.001919 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.001893 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:25.012369 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.012347 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 13:21:25.014338 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.014323 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal"
Apr 22 13:21:25.022231 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.022215 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 13:21:25.037334 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.037270 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:21:25.725601 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.725572 2567 reflector.go:430] "Caches populated"
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 13:21:25.779496 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.779467 2567 apiserver.go:52] "Watching apiserver" Apr 22 13:21:25.789223 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.789195 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 13:21:25.792015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.791986 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-59t8w","openshift-ovn-kubernetes/ovnkube-node-n7z2m","kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p","openshift-image-registry/node-ca-jgtzf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal","openshift-multus/multus-additional-cni-plugins-97vcn","openshift-multus/multus-k8gsc","openshift-multus/network-metrics-daemon-kbkn6","openshift-network-diagnostics/network-check-target-jklhk","kube-system/konnectivity-agent-xfrxg","openshift-cluster-node-tuning-operator/tuned-nxh6k"] Apr 22 13:21:25.797198 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.797150 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.799341 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.799308 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 13:21:25.799464 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.799319 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 13:21:25.799464 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.799413 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jgsg7\"" Apr 22 13:21:25.799738 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.799721 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 13:21:25.799947 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.799925 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:25.800024 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.800005 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-59t8w" Apr 22 13:21:25.801639 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.801620 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-64hkt\"" Apr 22 13:21:25.801897 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.801872 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 13:21:25.802003 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.801949 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 13:21:25.802003 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.801880 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 13:21:25.802105 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.802048 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 13:21:25.802353 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.802315 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.802640 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.802619 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 13:21:25.802739 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.802707 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xpx5q\"" Apr 22 13:21:25.802805 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.802745 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 13:21:25.804712 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.804545 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 13:21:25.804712 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.804605 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 13:21:25.804712 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.804545 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 13:21:25.805205 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.804761 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-m7dq6\"" Apr 22 13:21:25.805205 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.804885 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 13:21:25.805205 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.805068 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 13:21:25.805205 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.805132 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 13:21:25.808529 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.808508 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:25.808936 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.808699 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k8gsc" Apr 22 13:21:25.810389 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810370 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 13:21:25.810480 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810374 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 13:21:25.810673 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810637 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 13:21:25.810673 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810654 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 13:21:25.810770 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810728 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 13:21:25.810770 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810640 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ks7c6\"" Apr 22 13:21:25.810911 
ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810895 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:25.811043 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.810993 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 13:21:25.811043 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.811008 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dqq7h\"" Apr 22 13:21:25.811043 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:25.811004 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:25.812436 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812386 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-var-lib-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812436 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-etc-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812445 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-ovn\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-systemd\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovnkube-config\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-sys-fs\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.812591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812546 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-kubelet\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-systemd-units\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812805 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-node-log\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812805 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812630 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-log-socket\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812805 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812652 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-env-overrides\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.812805 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vrm\" (UniqueName: \"kubernetes.io/projected/8c3da84b-ac7e-4b33-9e51-bda762f06468-kube-api-access-z7vrm\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:25.812805 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.812712 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1d87803a-e452-4d88-ab3e-466482b69647-iptables-alerter-script\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w" Apr 22 13:21:25.813285 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813260 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8k5v\" (UniqueName: \"kubernetes.io/projected/1d87803a-e452-4d88-ab3e-466482b69647-kube-api-access-r8k5v\") pod \"iptables-alerter-59t8w\" (UID: 
\"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w" Apr 22 13:21:25.813367 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813318 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-cni-bin\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.813367 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-cni-netd\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.813481 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovn-node-metrics-cert\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.813481 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c3da84b-ac7e-4b33-9e51-bda762f06468-host\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:25.813605 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-run-ovn-kubernetes\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.813657 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-socket-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.813657 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813650 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-device-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.813754 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn7hs\" (UniqueName: \"kubernetes.io/projected/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-kube-api-access-nn7hs\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.813754 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovnkube-script-lib\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 
13:21:25.813853 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.813968 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c3da84b-ac7e-4b33-9e51-bda762f06468-serviceca\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.814003 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-run-netns\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.814036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.814070 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-slash\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.814100 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.814203 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.814127 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjc6\" (UniqueName: \"kubernetes.io/projected/474b725c-806b-45d7-b14a-c4e4d0f026cd-kube-api-access-mtjc6\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.814578 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.814214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-registration-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.814578 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:25.814289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d87803a-e452-4d88-ab3e-466482b69647-host-slash\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w" Apr 22 13:21:25.817209 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.817188 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:25.817292 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.817251 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:25.817292 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:25.817279 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:25.819344 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.819322 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 13:21:25.819436 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.819376 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 13:21:25.819436 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.819394 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-r7jpz\"" Apr 22 13:21:25.819679 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.819665 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:25.821687 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.821669 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tk9vn\"" Apr 22 13:21:25.821761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.821709 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 13:21:25.821761 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.821721 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 13:21:25.842322 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.842293 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 13:16:24 +0000 UTC" deadline="2027-09-20 08:43:42.269051754 +0000 UTC" Apr 22 13:21:25.842322 ip-10-0-142-133 kubenswrapper[2567]: I0422 
13:21:25.842322 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12379h22m16.42673323s"
Apr 22 13:21:25.902808 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.902782 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 13:21:25.914789 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-system-cni-dir\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45xb8\" (UniqueName: \"kubernetes.io/projected/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-kube-api-access-45xb8\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-sys-fs\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-kubelet\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-systemd-units\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-log-socket\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-env-overrides\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.914959 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914950 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-kubelet\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-sys-fs\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914983 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-systemd-units\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.914991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-log-socket\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915010 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjxvs\" (UniqueName: \"kubernetes.io/projected/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-kube-api-access-gjxvs\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915046 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysctl-conf\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-cni-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-var-lib-kubelet\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915126 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vrm\" (UniqueName: \"kubernetes.io/projected/8c3da84b-ac7e-4b33-9e51-bda762f06468-kube-api-access-z7vrm\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1d87803a-e452-4d88-ab3e-466482b69647-iptables-alerter-script\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8k5v\" (UniqueName: \"kubernetes.io/projected/1d87803a-e452-4d88-ab3e-466482b69647-kube-api-access-r8k5v\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915242 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.915333 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysconfig\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.915870 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915344 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-daemon-config\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.915870 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-host\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.915870 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915412 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-run-ovn-kubernetes\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.915870 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-env-overrides\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.915870 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1d87803a-e452-4d88-ab3e-466482b69647-iptables-alerter-script\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w"
Apr 22 13:21:25.916077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-run-ovn-kubernetes\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.916077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.915980 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-cnibin\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-netns\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916046 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgp4n\" (UniqueName: \"kubernetes.io/projected/220972eb-8140-47aa-bef6-2fc6a45d677d-kube-api-access-xgp4n\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916314 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916098 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-device-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916314 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916133 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-device-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916314 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn7hs\" (UniqueName: \"kubernetes.io/projected/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-kube-api-access-nn7hs\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916314 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916193 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-os-release\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.916314 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916310 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-etc-kubernetes\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-sys\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916446 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-multus-certs\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjc6\" (UniqueName: \"kubernetes.io/projected/474b725c-806b-45d7-b14a-c4e4d0f026cd-kube-api-access-mtjc6\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916569 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cnibin\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-conf-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916675 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-registration-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-etc-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-ovn\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916743 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-registration-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916753 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-kubernetes\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-kubelet\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916795 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-etc-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916810 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916820 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-ovn\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.916840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-systemd\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovnkube-config\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916905 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-systemd\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916942 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-tuned\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916960 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/225bdcdd-db73-48cb-b288-6a1d2423f8df-tmp\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.916975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drlc\" (UniqueName: \"kubernetes.io/projected/225bdcdd-db73-48cb-b288-6a1d2423f8df-kube-api-access-7drlc\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-cni-multus\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917037 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-node-log\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-lib-modules\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917102 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/220972eb-8140-47aa-bef6-2fc6a45d677d-cni-binary-copy\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917118 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-node-log\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/105c8108-1f35-4d1b-8170-b9f59625a7c3-konnectivity-ca\") pod \"konnectivity-agent-xfrxg\" (UID: \"105c8108-1f35-4d1b-8170-b9f59625a7c3\") " pod="kube-system/konnectivity-agent-xfrxg"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-cni-bin\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-cni-netd\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917234 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovn-node-metrics-cert\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917258 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-run\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.917474 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917258 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-cni-bin\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917589 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-cni-netd\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917772 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovnkube-config\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917801 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-os-release\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/105c8108-1f35-4d1b-8170-b9f59625a7c3-agent-certs\") pod \"konnectivity-agent-xfrxg\" (UID: \"105c8108-1f35-4d1b-8170-b9f59625a7c3\") " pod="kube-system/konnectivity-agent-xfrxg"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c3da84b-ac7e-4b33-9e51-bda762f06468-host\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917927 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c3da84b-ac7e-4b33-9e51-bda762f06468-host\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917950 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.917998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysctl-d\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918035 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-socket-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovnkube-script-lib\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918105 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918116 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-modprobe-d\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k"
Apr 22 13:21:25.918147 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c3da84b-ac7e-4b33-9e51-bda762f06468-serviceca\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-run-netns\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918200 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-socket-dir\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918253 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-system-cni-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918288 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-k8s-cni-cncf-io\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918334 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-cni-bin\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918364 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-hostroot\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-slash\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918479 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d87803a-e452-4d88-ab3e-466482b69647-host-slash\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-var-lib-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d87803a-e452-4d88-ab3e-466482b69647-host-slash\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w"
Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918656
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-run-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.918710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.918656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-systemd\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:25.919308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.919183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovnkube-script-lib\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.919308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.919246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-slash\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.919308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.919289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-socket-dir-parent\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:25.920446 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:25.919514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-host-run-netns\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.920446 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.919580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474b725c-806b-45d7-b14a-c4e4d0f026cd-var-lib-openvswitch\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.920446 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.919689 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c3da84b-ac7e-4b33-9e51-bda762f06468-serviceca\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:25.922532 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.922499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474b725c-806b-45d7-b14a-c4e4d0f026cd-ovn-node-metrics-cert\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.925812 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.925787 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjc6\" (UniqueName: \"kubernetes.io/projected/474b725c-806b-45d7-b14a-c4e4d0f026cd-kube-api-access-mtjc6\") pod \"ovnkube-node-n7z2m\" (UID: \"474b725c-806b-45d7-b14a-c4e4d0f026cd\") " pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:25.925933 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:25.925820 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn7hs\" (UniqueName: \"kubernetes.io/projected/5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4-kube-api-access-nn7hs\") pod \"aws-ebs-csi-driver-node-ms86p\" (UID: \"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:25.925993 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.925950 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8k5v\" (UniqueName: \"kubernetes.io/projected/1d87803a-e452-4d88-ab3e-466482b69647-kube-api-access-r8k5v\") pod \"iptables-alerter-59t8w\" (UID: \"1d87803a-e452-4d88-ab3e-466482b69647\") " pod="openshift-network-operator/iptables-alerter-59t8w" Apr 22 13:21:25.926064 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:25.926039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vrm\" (UniqueName: \"kubernetes.io/projected/8c3da84b-ac7e-4b33-9e51-bda762f06468-kube-api-access-z7vrm\") pod \"node-ca-jgtzf\" (UID: \"8c3da84b-ac7e-4b33-9e51-bda762f06468\") " pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:26.019917 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.019883 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-host\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020068 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.019930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " 
pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.020068 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-host\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020068 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-cnibin\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020086 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020087 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-netns\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020124 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-netns\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 
13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgp4n\" (UniqueName: \"kubernetes.io/projected/220972eb-8140-47aa-bef6-2fc6a45d677d-kube-api-access-xgp4n\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-os-release\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020197 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-cnibin\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020235 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-etc-kubernetes\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020250 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-etc-kubernetes\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 
13:21:26.020252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-sys\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020262 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-os-release\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-multus-certs\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-sys\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cnibin\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020379 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cnibin\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-multus-certs\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-conf-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-conf-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020501 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-kubernetes\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020520 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-kubelet\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.020542 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-kubelet\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-kubernetes\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.020624 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:26.520588062 +0000 UTC m=+3.173691823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-tuned\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020681 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/225bdcdd-db73-48cb-b288-6a1d2423f8df-tmp\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7drlc\" (UniqueName: \"kubernetes.io/projected/225bdcdd-db73-48cb-b288-6a1d2423f8df-kube-api-access-7drlc\") pod \"tuned-nxh6k\" (UID: 
\"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020732 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-cni-multus\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020762 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-lib-modules\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/220972eb-8140-47aa-bef6-2fc6a45d677d-cni-binary-copy\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020812 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/105c8108-1f35-4d1b-8170-b9f59625a7c3-konnectivity-ca\") pod \"konnectivity-agent-xfrxg\" (UID: \"105c8108-1f35-4d1b-8170-b9f59625a7c3\") " pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-run\") pod \"tuned-nxh6k\" (UID: 
\"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-os-release\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-cni-multus\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/105c8108-1f35-4d1b-8170-b9f59625a7c3-agent-certs\") pod \"konnectivity-agent-xfrxg\" (UID: \"105c8108-1f35-4d1b-8170-b9f59625a7c3\") " pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:26.020945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysctl-d\") pod \"tuned-nxh6k\" 
(UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.020995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-modprobe-d\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-system-cni-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021041 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-os-release\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-k8s-cni-cncf-io\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-cni-bin\") pod \"multus-k8gsc\" (UID: 
\"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-run\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021107 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-hostroot\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021122 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-lib-modules\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021225 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-systemd\") pod \"tuned-nxh6k\" (UID: 
\"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021267 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-socket-dir-parent\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021292 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-system-cni-dir\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45xb8\" (UniqueName: \"kubernetes.io/projected/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-kube-api-access-45xb8\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjxvs\" (UniqueName: \"kubernetes.io/projected/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-kube-api-access-gjxvs\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysctl-conf\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.021747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-cni-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-var-lib-kubelet\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/105c8108-1f35-4d1b-8170-b9f59625a7c3-konnectivity-ca\") pod \"konnectivity-agent-xfrxg\" (UID: \"105c8108-1f35-4d1b-8170-b9f59625a7c3\") " pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysconfig\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-daemon-config\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-systemd\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-modprobe-d\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/220972eb-8140-47aa-bef6-2fc6a45d677d-cni-binary-copy\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-system-cni-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021759 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-hostroot\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021788 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysctl-d\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-run-k8s-cni-cncf-io\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-host-var-lib-cni-bin\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021870 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-system-cni-dir\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021876 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysctl-conf\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021893 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-var-lib-kubelet\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022405 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-socket-dir-parent\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.021955 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-cni-dir\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.022007 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-sysconfig\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.022964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.022040 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/220972eb-8140-47aa-bef6-2fc6a45d677d-multus-daemon-config\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.022964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.022189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.022964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.022386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.023515 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.023490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/225bdcdd-db73-48cb-b288-6a1d2423f8df-tmp\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.023741 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.023722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/225bdcdd-db73-48cb-b288-6a1d2423f8df-etc-tuned\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.023923 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.023898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/105c8108-1f35-4d1b-8170-b9f59625a7c3-agent-certs\") pod \"konnectivity-agent-xfrxg\" (UID: \"105c8108-1f35-4d1b-8170-b9f59625a7c3\") " pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:26.026522 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.026497 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:21:26.026589 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.026530 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:21:26.026589 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.026543 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:26.026677 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.026611 2567 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:26.526594746 +0000 UTC m=+3.179698522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:26.028763 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.028717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgp4n\" (UniqueName: \"kubernetes.io/projected/220972eb-8140-47aa-bef6-2fc6a45d677d-kube-api-access-xgp4n\") pod \"multus-k8gsc\" (UID: \"220972eb-8140-47aa-bef6-2fc6a45d677d\") " pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.029189 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.029155 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjxvs\" (UniqueName: \"kubernetes.io/projected/52ffca56-a3d8-49c4-b537-b0fc46ac5d2c-kube-api-access-gjxvs\") pod \"multus-additional-cni-plugins-97vcn\" (UID: \"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c\") " pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.029308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.029214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drlc\" (UniqueName: \"kubernetes.io/projected/225bdcdd-db73-48cb-b288-6a1d2423f8df-kube-api-access-7drlc\") pod \"tuned-nxh6k\" (UID: \"225bdcdd-db73-48cb-b288-6a1d2423f8df\") " pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.030287 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:21:26.030262 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45xb8\" (UniqueName: \"kubernetes.io/projected/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-kube-api-access-45xb8\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:26.108714 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.108679 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" Apr 22 13:21:26.117377 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.117353 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jgtzf" Apr 22 13:21:26.126092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.126072 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-59t8w" Apr 22 13:21:26.130818 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.130799 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:26.137402 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.137381 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-97vcn" Apr 22 13:21:26.145015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.144996 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k8gsc" Apr 22 13:21:26.150614 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.150586 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:26.156197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.156180 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" Apr 22 13:21:26.260459 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.260425 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 13:21:26.519291 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.519266 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225bdcdd_db73_48cb_b288_6a1d2423f8df.slice/crio-b15ecd444cf91a1139f18bb563ee8a09b068b2a9cd172461eb3d72fb66a497fd WatchSource:0}: Error finding container b15ecd444cf91a1139f18bb563ee8a09b068b2a9cd172461eb3d72fb66a497fd: Status 404 returned error can't find the container with id b15ecd444cf91a1139f18bb563ee8a09b068b2a9cd172461eb3d72fb66a497fd Apr 22 13:21:26.520392 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.520367 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474b725c_806b_45d7_b14a_c4e4d0f026cd.slice/crio-ce197b5cfd5f9fdeaaedf6def5fb14f7abc0e12fc564516a2dbdb65dc600e7be WatchSource:0}: Error finding container ce197b5cfd5f9fdeaaedf6def5fb14f7abc0e12fc564516a2dbdb65dc600e7be: Status 404 returned error can't find the container with id ce197b5cfd5f9fdeaaedf6def5fb14f7abc0e12fc564516a2dbdb65dc600e7be Apr 22 13:21:26.520854 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.520830 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod105c8108_1f35_4d1b_8170_b9f59625a7c3.slice/crio-6f6f4f5441fc773fee69238aeea6182bfbc31cfa00b4db22ba741abdcd8992fb WatchSource:0}: Error finding container 6f6f4f5441fc773fee69238aeea6182bfbc31cfa00b4db22ba741abdcd8992fb: Status 404 returned error can't find the container with id 6f6f4f5441fc773fee69238aeea6182bfbc31cfa00b4db22ba741abdcd8992fb Apr 22 13:21:26.522257 ip-10-0-142-133 
kubenswrapper[2567]: W0422 13:21:26.522156 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c3da84b_ac7e_4b33_9e51_bda762f06468.slice/crio-2c4c0c3ccc0306f5088d3ffd585f9839c15da21209312944c419d707c4bae1ba WatchSource:0}: Error finding container 2c4c0c3ccc0306f5088d3ffd585f9839c15da21209312944c419d707c4bae1ba: Status 404 returned error can't find the container with id 2c4c0c3ccc0306f5088d3ffd585f9839c15da21209312944c419d707c4bae1ba Apr 22 13:21:26.524998 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.524973 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ffca56_a3d8_49c4_b537_b0fc46ac5d2c.slice/crio-8783204ed853f30a20f2ed77f56fc6fcaf39b6c44ffde11ce923c625f0bbc8d8 WatchSource:0}: Error finding container 8783204ed853f30a20f2ed77f56fc6fcaf39b6c44ffde11ce923c625f0bbc8d8: Status 404 returned error can't find the container with id 8783204ed853f30a20f2ed77f56fc6fcaf39b6c44ffde11ce923c625f0bbc8d8 Apr 22 13:21:26.525094 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.525022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:26.525250 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.525158 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:26.525311 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.525269 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:27.525250168 +0000 UTC m=+4.178353930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:26.528026 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.528002 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d87803a_e452_4d88_ab3e_466482b69647.slice/crio-9c3463dba66b3d52c32a1713e0c2ea416bc0ef3d7b658979fd69e0bc78b00a05 WatchSource:0}: Error finding container 9c3463dba66b3d52c32a1713e0c2ea416bc0ef3d7b658979fd69e0bc78b00a05: Status 404 returned error can't find the container with id 9c3463dba66b3d52c32a1713e0c2ea416bc0ef3d7b658979fd69e0bc78b00a05 Apr 22 13:21:26.548889 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.548861 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aad924b_2ce8_4efb_a7b1_7b9e7e09e7f4.slice/crio-bca9e8e01f25f92950463b657d90da87a5fee8dc77d107cb5205deab003b26da WatchSource:0}: Error finding container bca9e8e01f25f92950463b657d90da87a5fee8dc77d107cb5205deab003b26da: Status 404 returned error can't find the container with id bca9e8e01f25f92950463b657d90da87a5fee8dc77d107cb5205deab003b26da Apr 22 13:21:26.549582 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:21:26.549554 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220972eb_8140_47aa_bef6_2fc6a45d677d.slice/crio-715e8191901131a4290dca38e6d9a06f10c9b29f834d7a8d3e29370da7e2b9f7 WatchSource:0}: Error finding container 715e8191901131a4290dca38e6d9a06f10c9b29f834d7a8d3e29370da7e2b9f7: Status 404 returned error can't find the 
container with id 715e8191901131a4290dca38e6d9a06f10c9b29f834d7a8d3e29370da7e2b9f7 Apr 22 13:21:26.626563 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.626357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:26.626631 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.626486 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:21:26.626631 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.626621 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:21:26.626715 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.626632 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:26.626715 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.626677 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.626662634 +0000 UTC m=+4.279766398 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:26.842758 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.842718 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 13:16:24 +0000 UTC" deadline="2027-11-12 14:04:05.445677449 +0000 UTC" Apr 22 13:21:26.842758 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.842753 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13656h42m38.602927064s" Apr 22 13:21:26.943963 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.943857 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:26.944214 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:26.944020 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:26.959090 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.959014 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" event={"ID":"225bdcdd-db73-48cb-b288-6a1d2423f8df","Type":"ContainerStarted","Data":"b15ecd444cf91a1139f18bb563ee8a09b068b2a9cd172461eb3d72fb66a497fd"} Apr 22 13:21:26.963246 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.963214 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal" event={"ID":"e2e036ce9e751626ebdb05be6b1bff59","Type":"ContainerStarted","Data":"f6a13ad4760b1e044e2e256fa38783648df6ca8f0ef6029a890bd98ce15bd6bd"} Apr 22 13:21:26.970001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.969969 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" event={"ID":"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4","Type":"ContainerStarted","Data":"bca9e8e01f25f92950463b657d90da87a5fee8dc77d107cb5205deab003b26da"} Apr 22 13:21:26.988088 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.986381 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-59t8w" event={"ID":"1d87803a-e452-4d88-ab3e-466482b69647","Type":"ContainerStarted","Data":"9c3463dba66b3d52c32a1713e0c2ea416bc0ef3d7b658979fd69e0bc78b00a05"} Apr 22 13:21:26.989834 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.989755 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerStarted","Data":"8783204ed853f30a20f2ed77f56fc6fcaf39b6c44ffde11ce923c625f0bbc8d8"} Apr 22 13:21:26.993278 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:26.993091 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-jgtzf" event={"ID":"8c3da84b-ac7e-4b33-9e51-bda762f06468","Type":"ContainerStarted","Data":"2c4c0c3ccc0306f5088d3ffd585f9839c15da21209312944c419d707c4bae1ba"} Apr 22 13:21:27.001747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.001700 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8gsc" event={"ID":"220972eb-8140-47aa-bef6-2fc6a45d677d","Type":"ContainerStarted","Data":"715e8191901131a4290dca38e6d9a06f10c9b29f834d7a8d3e29370da7e2b9f7"} Apr 22 13:21:27.010929 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.007702 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xfrxg" event={"ID":"105c8108-1f35-4d1b-8170-b9f59625a7c3","Type":"ContainerStarted","Data":"6f6f4f5441fc773fee69238aeea6182bfbc31cfa00b4db22ba741abdcd8992fb"} Apr 22 13:21:27.010929 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.009432 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"ce197b5cfd5f9fdeaaedf6def5fb14f7abc0e12fc564516a2dbdb65dc600e7be"} Apr 22 13:21:27.531105 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.531026 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-133.ec2.internal" podStartSLOduration=2.531006691 podStartE2EDuration="2.531006691s" podCreationTimestamp="2026-04-22 13:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:21:26.986266404 +0000 UTC m=+3.639370187" watchObservedRunningTime="2026-04-22 13:21:27.531006691 +0000 UTC m=+4.184110464" Apr 22 13:21:27.531877 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.531848 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pjr6r"] Apr 22 13:21:27.533921 
ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.533896 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:27.534078 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.534055 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:27.534149 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.534131 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:29.53411214 +0000 UTC m=+6.187215900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:27.538044 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.534644 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.538044 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.536894 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.538044 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.537115 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bwhcj\""
Apr 22 13:21:27.538044 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.537370 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.634309 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.634274 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f59a955-0700-417a-a991-8b127c7e9438-hosts-file\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.634488 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.634342 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:27.634488 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.634375 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgtw4\" (UniqueName: \"kubernetes.io/projected/1f59a955-0700-417a-a991-8b127c7e9438-kube-api-access-fgtw4\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.634488 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.634403 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f59a955-0700-417a-a991-8b127c7e9438-tmp-dir\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.634655 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.634575 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 13:21:27.634655 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.634597 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 13:21:27.634655 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.634609 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:27.634782 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.634666 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:29.634646182 +0000 UTC m=+6.287749955 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:27.735205 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.735154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgtw4\" (UniqueName: \"kubernetes.io/projected/1f59a955-0700-417a-a991-8b127c7e9438-kube-api-access-fgtw4\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.735389 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.735221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f59a955-0700-417a-a991-8b127c7e9438-tmp-dir\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.735389 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.735276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f59a955-0700-417a-a991-8b127c7e9438-hosts-file\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.735389 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.735378 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f59a955-0700-417a-a991-8b127c7e9438-hosts-file\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.736012 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.735989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f59a955-0700-417a-a991-8b127c7e9438-tmp-dir\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.750969 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.747542 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgtw4\" (UniqueName: \"kubernetes.io/projected/1f59a955-0700-417a-a991-8b127c7e9438-kube-api-access-fgtw4\") pod \"node-resolver-pjr6r\" (UID: \"1f59a955-0700-417a-a991-8b127c7e9438\") " pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.849029 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.848551 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pjr6r"
Apr 22 13:21:27.944942 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:27.944393 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:27.944942 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:27.944528 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:28.022759 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:28.022723 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pjr6r" event={"ID":"1f59a955-0700-417a-a991-8b127c7e9438","Type":"ContainerStarted","Data":"d28a4335502dfdf21a2ed2a9fcb4494a9d6e0300b8474b91116215b10751b661"}
Apr 22 13:21:28.029109 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:28.028591 2567 generic.go:358] "Generic (PLEG): container finished" podID="073188d8ef5c4ac245f88f75f2b768bb" containerID="c5e6d9f2c5e351e330e78fd832a1314014c6ae297c9848ce72d31d04d345b2f7" exitCode=0
Apr 22 13:21:28.029109 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:28.028703 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal" event={"ID":"073188d8ef5c4ac245f88f75f2b768bb","Type":"ContainerDied","Data":"c5e6d9f2c5e351e330e78fd832a1314014c6ae297c9848ce72d31d04d345b2f7"}
Apr 22 13:21:28.943369 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:28.943337 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:28.943827 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:28.943474 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:29.044318 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:29.044281 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal" event={"ID":"073188d8ef5c4ac245f88f75f2b768bb","Type":"ContainerStarted","Data":"e0db4cdf34547f201933b400d369d8ca66cf52f013d36a9a1a3574ec1f090b12"}
Apr 22 13:21:29.552101 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:29.552067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:29.552293 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.552244 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 13:21:29.552363 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.552303 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:33.552284785 +0000 UTC m=+10.205388544 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 13:21:29.652939 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:29.652901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:29.653096 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.653053 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 13:21:29.653096 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.653076 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 13:21:29.653096 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.653088 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:29.653286 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.653151 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:33.653130163 +0000 UTC m=+10.306233940 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:29.943600 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:29.943309 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:29.943600 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:29.943485 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:30.943564 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:30.943520 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:30.943741 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:30.943704 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:31.943210 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:31.943180 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:31.943377 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:31.943285 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:32.943461 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:32.943425 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:32.943903 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:32.943566 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:33.588316 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:33.588261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:33.588519 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.588432 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 13:21:33.588587 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.588523 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:41.588499978 +0000 UTC m=+18.241603751 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 13:21:33.689404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:33.689157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:33.689404 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.689332 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 13:21:33.689404 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.689360 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 13:21:33.689404 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.689375 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:33.689734 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.689442 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:41.689422974 +0000 UTC m=+18.342526747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:33.944455 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:33.944373 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:33.944875 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:33.944495 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:34.944216 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:34.944018 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:34.944216 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:34.944160 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:35.943563 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:35.943525 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:35.943945 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:35.943669 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:36.943627 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:36.943590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:36.944139 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:36.943726 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:37.943467 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:37.943438 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:37.943656 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:37.943572 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:38.943778 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:38.943735 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:38.944225 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:38.943845 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:39.944059 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:39.944029 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:39.944553 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:39.944184 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:40.943131 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:40.943103 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:40.943411 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:40.943235 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:41.646525 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:41.646471 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:41.647047 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.646628 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 13:21:41.647047 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.646683 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:57.646664807 +0000 UTC m=+34.299768565 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 13:21:41.746833 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:41.746798 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:41.747121 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.746957 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 13:21:41.747121 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.746973 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 13:21:41.747121 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.746982 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:41.747121 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.747045 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:57.747026189 +0000 UTC m=+34.400129958 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 13:21:41.944121 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:41.944015 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:41.944281 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:41.944201 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:42.943843 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:42.943813 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:42.944271 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:42.943937 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91"
Apr 22 13:21:43.946503 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:43.946074 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:21:43.946503 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:43.946445 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8"
Apr 22 13:21:44.071578 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.071536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" event={"ID":"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4","Type":"ContainerStarted","Data":"15ed567d58875e902bc9b69aef91d17798bc72af19b5124b468d772d04110895"}
Apr 22 13:21:44.073181 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.073134 2567 generic.go:358] "Generic (PLEG): container finished" podID="52ffca56-a3d8-49c4-b537-b0fc46ac5d2c" containerID="e7df249b5e908a1df3905197ba951211f9eb72096aa56fe76b3086e5372d8585" exitCode=0
Apr 22 13:21:44.073316 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.073206 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerDied","Data":"e7df249b5e908a1df3905197ba951211f9eb72096aa56fe76b3086e5372d8585"}
Apr 22 13:21:44.074769 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.074699 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jgtzf" event={"ID":"8c3da84b-ac7e-4b33-9e51-bda762f06468","Type":"ContainerStarted","Data":"128c8cd72f063a3f651e57e0ac299c70fcef0cf0af39427bd91a6fe03b828a13"}
Apr 22 13:21:44.076124 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.076058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pjr6r" event={"ID":"1f59a955-0700-417a-a991-8b127c7e9438","Type":"ContainerStarted","Data":"05b71d5c3c63b2e941953a76a1dd77d356c6c441702e70aba4aaa564ac73b856"}
Apr 22 13:21:44.077411 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.077387 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k8gsc" event={"ID":"220972eb-8140-47aa-bef6-2fc6a45d677d","Type":"ContainerStarted","Data":"320929423b134dc46908d3b18b80574858e4e90e7057814fae27e4a270e5f7c8"}
Apr 22 13:21:44.078939 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.078921 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xfrxg" event={"ID":"105c8108-1f35-4d1b-8170-b9f59625a7c3","Type":"ContainerStarted","Data":"1d3d4ae509ca3d0c9a37251238936e4538cd37274c5f2b1f683e2c09ef90839b"}
Apr 22 13:21:44.081480 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081460 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:21:44.081765 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081744 2567 generic.go:358] "Generic (PLEG): container finished" podID="474b725c-806b-45d7-b14a-c4e4d0f026cd" containerID="7eef8246211325a0b2cd0fe062763f1d78f06807ce2af6c0ec828e0308bc3d19" exitCode=1
Apr 22 13:21:44.081846 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"dda75e06ab987220dcbf29fb700502b72f1ed1e1620171d17165aca69dd5252e"}
Apr 22 13:21:44.081846 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"6a45f6da038ae7418539f7082fe32c87417a5211bb099df2b7319d814ad3eea6"}
Apr 22 13:21:44.081941 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081847 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"21a618ad8aaed0f95a8363167a82529d708de003e1cc40e417eb53b1ef7ab23a"}
Apr 22 13:21:44.081941 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"b71a78e1fe53d694bd03325c7d2151125c0f309bd0032779db7ae77a25b5d5dc"}
Apr 22 13:21:44.081941 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081872 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerDied","Data":"7eef8246211325a0b2cd0fe062763f1d78f06807ce2af6c0ec828e0308bc3d19"}
Apr 22 13:21:44.081941 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.081886 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"2960b31fed4cd7d78746c1dfd7bab2c2bb97583cb22cea5ddce04027a83a06d7"}
Apr 22 13:21:44.082993 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.082973 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" event={"ID":"225bdcdd-db73-48cb-b288-6a1d2423f8df","Type":"ContainerStarted","Data":"307683c4a2f6de38d16111b76f6ff4f9c5034dd7a8c24472e39fbc313bde6569"}
Apr 22 13:21:44.096441 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.096393 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-133.ec2.internal" podStartSLOduration=19.096377507 podStartE2EDuration="19.096377507s" podCreationTimestamp="2026-04-22 13:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:21:29.05821897 +0000 UTC m=+5.711322752" watchObservedRunningTime="2026-04-22 13:21:44.096377507 +0000 UTC m=+20.749481289"
Apr 22 13:21:44.114470 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.114414 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k8gsc" podStartSLOduration=3.487726135 podStartE2EDuration="20.114394244s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.551770107 +0000 UTC m=+3.204873866" lastFinishedPulling="2026-04-22 13:21:43.178438215 +0000 UTC m=+19.831541975" observedRunningTime="2026-04-22 13:21:44.114186727 +0000 UTC m=+20.767290503" watchObservedRunningTime="2026-04-22 13:21:44.114394244 +0000 UTC m=+20.767498047"
Apr 22 13:21:44.128662 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.128615 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xfrxg" podStartSLOduration=3.5139670130000003 podStartE2EDuration="20.128598887s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.526710654 +0000 UTC m=+3.179814414" lastFinishedPulling="2026-04-22 13:21:43.141342526 +0000 UTC m=+19.794446288" observedRunningTime="2026-04-22 13:21:44.128441537 +0000 UTC m=+20.781545319" watchObservedRunningTime="2026-04-22 13:21:44.128598887 +0000 UTC m=+20.781702669"
Apr 22 13:21:44.145098 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.145050 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nxh6k" podStartSLOduration=3.265540704 podStartE2EDuration="20.145035263s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.52162384 +0000 UTC m=+3.174727599" lastFinishedPulling="2026-04-22 13:21:43.401118383 +0000 UTC m=+20.054222158" observedRunningTime="2026-04-22 13:21:44.144392673 +0000 UTC m=+20.797496453" watchObservedRunningTime="2026-04-22 13:21:44.145035263 +0000 UTC m=+20.798139044"
Apr 22 13:21:44.158597 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.158552 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pjr6r" podStartSLOduration=1.8960572949999999 podStartE2EDuration="17.158535808s" podCreationTimestamp="2026-04-22 13:21:27 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.87881976 +0000 UTC m=+4.531923524" lastFinishedPulling="2026-04-22 13:21:43.141298264 +0000 UTC m=+19.794402037" observedRunningTime="2026-04-22 13:21:44.15811397 +0000 UTC m=+20.811217751" watchObservedRunningTime="2026-04-22 13:21:44.158535808 +0000 UTC m=+20.811639594"
Apr 22 13:21:44.334415 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.334338 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pjr6r_1f59a955-0700-417a-a991-8b127c7e9438/dns-node-resolver/0.log"
Apr 22 13:21:44.615806 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.615781 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 13:21:44.876866 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.876710 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T13:21:44.615800131Z","UUID":"6f6eadda-45ed-49ce-a599-6bbcdf2a1c27","Handler":null,"Name":"","Endpoint":""}
Apr 22 13:21:44.880106 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.880081 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 13:21:44.880247 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.880115 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 13:21:44.943568 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:44.943539 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:21:44.943728 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:44.943659 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:45.087246 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:45.087210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" event={"ID":"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4","Type":"ContainerStarted","Data":"9c9350313f28a52a6cdcefe6d4948b1001454c0d36c32ce852b686ec34ad847d"} Apr 22 13:21:45.089751 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:45.089696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-59t8w" event={"ID":"1d87803a-e452-4d88-ab3e-466482b69647","Type":"ContainerStarted","Data":"a9630b891094823b1df8afbef7f285620b3085266fc3b197d238a37b3fc07b63"} Apr 22 13:21:45.107853 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:45.107799 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-59t8w" podStartSLOduration=4.513615901 podStartE2EDuration="21.107787696s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.547125175 +0000 UTC m=+3.200228934" lastFinishedPulling="2026-04-22 13:21:43.141296953 +0000 UTC m=+19.794400729" observedRunningTime="2026-04-22 13:21:45.107552051 +0000 UTC m=+21.760655833" watchObservedRunningTime="2026-04-22 13:21:45.107787696 +0000 UTC m=+21.760891477" Apr 22 13:21:45.108145 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:45.108113 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jgtzf" podStartSLOduration=5.493648691 podStartE2EDuration="22.108106967s" podCreationTimestamp="2026-04-22 13:21:23 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.526845021 +0000 UTC m=+3.179948787" lastFinishedPulling="2026-04-22 13:21:43.141303293 +0000 UTC m=+19.794407063" observedRunningTime="2026-04-22 13:21:44.178529247 +0000 UTC 
m=+20.831633025" watchObservedRunningTime="2026-04-22 13:21:45.108106967 +0000 UTC m=+21.761210752" Apr 22 13:21:45.117751 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:45.117725 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jgtzf_8c3da84b-ac7e-4b33-9e51-bda762f06468/node-ca/0.log" Apr 22 13:21:45.944251 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:45.944218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:45.944429 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:45.944395 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:46.096423 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:46.096203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" event={"ID":"5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4","Type":"ContainerStarted","Data":"f754a27a45760157ab5b666a97e9cd12a213291d85626ce23cb91f9c49bd947f"} Apr 22 13:21:46.114117 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:46.114027 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ms86p" podStartSLOduration=3.8030906079999998 podStartE2EDuration="23.114008821s" podCreationTimestamp="2026-04-22 13:21:23 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.551880681 +0000 UTC m=+3.204984442" lastFinishedPulling="2026-04-22 13:21:45.862798883 +0000 UTC m=+22.515902655" observedRunningTime="2026-04-22 13:21:46.113697275 +0000 UTC m=+22.766801068" watchObservedRunningTime="2026-04-22 
13:21:46.114008821 +0000 UTC m=+22.767112603" Apr 22 13:21:46.943834 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:46.943796 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:46.944038 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:46.943926 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:47.101329 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:47.101305 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:21:47.101844 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:47.101661 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"0b9c356a7f43afb7c0c0c8290de86c47787b6e35efbcfd4af1a084a8a551f10b"} Apr 22 13:21:47.943693 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:47.943661 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:47.943855 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:47.943811 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:48.158098 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:48.158062 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:48.158744 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:48.158726 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:48.944277 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:48.944088 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:48.944426 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:48.944345 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:49.107026 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.106993 2567 generic.go:358] "Generic (PLEG): container finished" podID="52ffca56-a3d8-49c4-b537-b0fc46ac5d2c" containerID="771be8298777dfc75bde562fea8f5fdd12273c56d3a88f59a117a5ee875ae2f9" exitCode=0 Apr 22 13:21:49.107211 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.107081 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerDied","Data":"771be8298777dfc75bde562fea8f5fdd12273c56d3a88f59a117a5ee875ae2f9"} Apr 22 13:21:49.110016 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.109995 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:21:49.110364 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.110333 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"667d69cf8d7c690404997e73b76a7d2e57e107ec6c5fa6054f7103cd394e0a60"} Apr 22 13:21:49.110642 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.110620 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:49.110799 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.110782 2567 scope.go:117] "RemoveContainer" containerID="7eef8246211325a0b2cd0fe062763f1d78f06807ce2af6c0ec828e0308bc3d19" Apr 22 13:21:49.111000 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.110986 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xfrxg" Apr 22 13:21:49.943850 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:49.943810 
2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:49.944345 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:49.943931 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:50.114614 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.114582 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:21:50.114954 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.114918 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" event={"ID":"474b725c-806b-45d7-b14a-c4e4d0f026cd","Type":"ContainerStarted","Data":"2043a62fcd0d3413ee673f56b766d401f714979e3435007e1dea230c6d287a6f"} Apr 22 13:21:50.115248 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.115230 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:50.115334 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.115260 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:50.115334 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.115274 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:50.117182 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.117135 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="52ffca56-a3d8-49c4-b537-b0fc46ac5d2c" containerID="f05e9ba6f2d31582d8f53bb83392d874b48c988a41040a49d7d45573e772324f" exitCode=0 Apr 22 13:21:50.117278 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.117203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerDied","Data":"f05e9ba6f2d31582d8f53bb83392d874b48c988a41040a49d7d45573e772324f"} Apr 22 13:21:50.129977 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.129955 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:50.130088 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.130044 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:21:50.194752 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.194707 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" podStartSLOduration=9.350483456 podStartE2EDuration="26.194692659s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.526712648 +0000 UTC m=+3.179816410" lastFinishedPulling="2026-04-22 13:21:43.370921833 +0000 UTC m=+20.024025613" observedRunningTime="2026-04-22 13:21:50.15550458 +0000 UTC m=+26.808608360" watchObservedRunningTime="2026-04-22 13:21:50.194692659 +0000 UTC m=+26.847796474" Apr 22 13:21:50.943914 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:50.943827 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:50.944290 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:50.943921 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:51.121348 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:51.121311 2567 generic.go:358] "Generic (PLEG): container finished" podID="52ffca56-a3d8-49c4-b537-b0fc46ac5d2c" containerID="f0b285481fefdf83b4e282b6b6af204f979869e345cbf23eab04cf3a2d41bddd" exitCode=0 Apr 22 13:21:51.121472 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:51.121359 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerDied","Data":"f0b285481fefdf83b4e282b6b6af204f979869e345cbf23eab04cf3a2d41bddd"} Apr 22 13:21:51.944146 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:51.944111 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:51.944541 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:51.944264 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:52.943579 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:52.943549 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:52.943844 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:52.943679 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:53.944857 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:53.944822 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:53.945343 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:53.944950 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:54.943586 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:54.943553 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:54.943780 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:54.943662 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:55.947185 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:55.947139 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:55.947680 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:55.947276 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:56.943804 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:56.943785 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:56.943899 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:56.943880 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:57.136144 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:57.136109 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerStarted","Data":"761b3e6f63859abbaf05e182bfee5097234acc2a5fdc708fa818d772e52655dc"} Apr 22 13:21:57.667866 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:57.667825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:57.668073 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.667961 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:57.668073 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.668032 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs podName:1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8 nodeName:}" failed. No retries permitted until 2026-04-22 13:22:29.66801733 +0000 UTC m=+66.321121090 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs") pod "network-metrics-daemon-kbkn6" (UID: "1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:57.768946 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:57.768913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:57.769109 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.769042 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:21:57.769109 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.769057 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:21:57.769109 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.769067 2567 projected.go:194] Error preparing data for projected volume kube-api-access-g5blw for pod openshift-network-diagnostics/network-check-target-jklhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:57.769250 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.769121 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw podName:2887fb17-5e94-487b-9353-de32227a0d91 nodeName:}" failed. 
No retries permitted until 2026-04-22 13:22:29.769102973 +0000 UTC m=+66.422206733 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-g5blw" (UniqueName: "kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw") pod "network-check-target-jklhk" (UID: "2887fb17-5e94-487b-9353-de32227a0d91") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:57.946225 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:57.946138 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:57.946363 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:57.946252 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:21:58.141810 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:58.141777 2567 generic.go:358] "Generic (PLEG): container finished" podID="52ffca56-a3d8-49c4-b537-b0fc46ac5d2c" containerID="761b3e6f63859abbaf05e182bfee5097234acc2a5fdc708fa818d772e52655dc" exitCode=0 Apr 22 13:21:58.142189 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:58.141822 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerDied","Data":"761b3e6f63859abbaf05e182bfee5097234acc2a5fdc708fa818d772e52655dc"} Apr 22 13:21:58.943963 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:58.943936 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:21:58.944126 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:58.944028 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:21:59.145384 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:59.145339 2567 generic.go:358] "Generic (PLEG): container finished" podID="52ffca56-a3d8-49c4-b537-b0fc46ac5d2c" containerID="11590480188d37b357f437ec324d8d7302efaa347f4a3027cf18c22011f4aa37" exitCode=0 Apr 22 13:21:59.145384 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:59.145388 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerDied","Data":"11590480188d37b357f437ec324d8d7302efaa347f4a3027cf18c22011f4aa37"} Apr 22 13:21:59.945686 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:21:59.945525 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:21:59.945849 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:21:59.945752 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:22:00.149730 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:00.149699 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-97vcn" event={"ID":"52ffca56-a3d8-49c4-b537-b0fc46ac5d2c","Type":"ContainerStarted","Data":"31a5d89f87b602e3ce188a21b449b2dae38fadb57bf7c2e6d99d5b7bf74bdca4"} Apr 22 13:22:00.171595 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:00.171549 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-97vcn" podStartSLOduration=5.785790714 podStartE2EDuration="36.171534918s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:21:26.527565787 +0000 UTC m=+3.180669551" lastFinishedPulling="2026-04-22 13:21:56.913309996 +0000 UTC m=+33.566413755" observedRunningTime="2026-04-22 13:22:00.171120507 +0000 UTC m=+36.824224288" watchObservedRunningTime="2026-04-22 13:22:00.171534918 +0000 UTC m=+36.824638699" Apr 22 13:22:00.943420 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:00.943379 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:00.943595 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:00.943515 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:22:01.945962 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:01.945933 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:01.946341 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:01.946035 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:22:02.943670 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:02.943637 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:02.943835 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:02.943731 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:22:03.944144 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:03.944114 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:03.944510 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:03.944218 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:22:04.943663 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:04.943632 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:04.943887 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:04.943734 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:22:05.180899 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:05.180863 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jklhk"] Apr 22 13:22:05.181619 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:05.180962 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:05.181619 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:05.181048 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:22:05.183839 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:05.183816 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbkn6"] Apr 22 13:22:05.183980 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:05.183923 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:05.184043 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:05.184002 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:22:06.943125 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:06.943092 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:06.943602 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:06.943101 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:06.943602 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:06.943219 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:22:06.943602 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:06.943263 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:22:08.943988 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:08.943950 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:08.944462 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:08.943950 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:08.944462 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:08.944112 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jklhk" podUID="2887fb17-5e94-487b-9353-de32227a0d91" Apr 22 13:22:08.944462 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:08.944133 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbkn6" podUID="1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8" Apr 22 13:22:09.181266 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.181188 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeReady" Apr 22 13:22:09.181404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.181319 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 13:22:09.228007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.227977 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gh69k"] Apr 22 13:22:09.262632 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.262606 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bh9wx"] Apr 22 13:22:09.262782 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.262766 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.265737 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.265712 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rrhbx\"" Apr 22 13:22:09.265965 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.265946 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 13:22:09.266040 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.266023 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 13:22:09.286241 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.286058 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gh69k"] Apr 22 13:22:09.286241 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.286243 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bh9wx"] 
Apr 22 13:22:09.286426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.286255 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9r9bf"] Apr 22 13:22:09.286426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.286210 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.288646 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.288621 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 13:22:09.288835 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.288698 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 13:22:09.288835 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.288626 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 13:22:09.289024 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.288944 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6bcgp\"" Apr 22 13:22:09.304105 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.304079 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9r9bf"] Apr 22 13:22:09.304256 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.304192 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.306710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.306687 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 13:22:09.306710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.306702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 13:22:09.306918 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.306702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 13:22:09.306918 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.306732 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 13:22:09.306918 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.306687 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jvn9v\"" Apr 22 13:22:09.362073 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.362043 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/352d4472-2ff1-4835-b7bb-78277c591127-tmp-dir\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.362073 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.362074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/352d4472-2ff1-4835-b7bb-78277c591127-metrics-tls\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.362305 
ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.362093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvqq\" (UniqueName: \"kubernetes.io/projected/352d4472-2ff1-4835-b7bb-78277c591127-kube-api-access-4mvqq\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.362305 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.362148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/352d4472-2ff1-4835-b7bb-78277c591127-config-volume\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.463049 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.462954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1257ae3-f69b-4b5c-b3ff-2400607495ed-data-volume\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.463049 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.462990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e1257ae3-f69b-4b5c-b3ff-2400607495ed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.463049 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/352d4472-2ff1-4835-b7bb-78277c591127-tmp-dir\") pod \"dns-default-gh69k\" (UID: 
\"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/352d4472-2ff1-4835-b7bb-78277c591127-metrics-tls\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvqq\" (UniqueName: \"kubernetes.io/projected/352d4472-2ff1-4835-b7bb-78277c591127-kube-api-access-4mvqq\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463206 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e1257ae3-f69b-4b5c-b3ff-2400607495ed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463247 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15464d00-15bf-4f8f-a47f-ffeac13f32c5-cert\") pod \"ingress-canary-bh9wx\" (UID: \"15464d00-15bf-4f8f-a47f-ffeac13f32c5\") " pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/352d4472-2ff1-4835-b7bb-78277c591127-config-volume\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrg5\" (UniqueName: \"kubernetes.io/projected/15464d00-15bf-4f8f-a47f-ffeac13f32c5-kube-api-access-5lrg5\") pod \"ingress-canary-bh9wx\" (UID: \"15464d00-15bf-4f8f-a47f-ffeac13f32c5\") " pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463338 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e1257ae3-f69b-4b5c-b3ff-2400607495ed-crio-socket\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463360 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkv4n\" (UniqueName: \"kubernetes.io/projected/e1257ae3-f69b-4b5c-b3ff-2400607495ed-kube-api-access-gkv4n\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.463426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/352d4472-2ff1-4835-b7bb-78277c591127-tmp-dir\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.463865 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.463847 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/352d4472-2ff1-4835-b7bb-78277c591127-config-volume\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.467562 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.467540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/352d4472-2ff1-4835-b7bb-78277c591127-metrics-tls\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.470060 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.470040 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvqq\" (UniqueName: \"kubernetes.io/projected/352d4472-2ff1-4835-b7bb-78277c591127-kube-api-access-4mvqq\") pod \"dns-default-gh69k\" (UID: \"352d4472-2ff1-4835-b7bb-78277c591127\") " pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.564197 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkv4n\" (UniqueName: \"kubernetes.io/projected/e1257ae3-f69b-4b5c-b3ff-2400607495ed-kube-api-access-gkv4n\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1257ae3-f69b-4b5c-b3ff-2400607495ed-data-volume\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564404 ip-10-0-142-133 kubenswrapper[2567]: I0422 
13:22:09.564248 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e1257ae3-f69b-4b5c-b3ff-2400607495ed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e1257ae3-f69b-4b5c-b3ff-2400607495ed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564289 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15464d00-15bf-4f8f-a47f-ffeac13f32c5-cert\") pod \"ingress-canary-bh9wx\" (UID: \"15464d00-15bf-4f8f-a47f-ffeac13f32c5\") " pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.564404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrg5\" (UniqueName: \"kubernetes.io/projected/15464d00-15bf-4f8f-a47f-ffeac13f32c5-kube-api-access-5lrg5\") pod \"ingress-canary-bh9wx\" (UID: \"15464d00-15bf-4f8f-a47f-ffeac13f32c5\") " pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.564404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e1257ae3-f69b-4b5c-b3ff-2400607495ed-crio-socket\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " 
pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564713 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e1257ae3-f69b-4b5c-b3ff-2400607495ed-crio-socket\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564713 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564706 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1257ae3-f69b-4b5c-b3ff-2400607495ed-data-volume\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.564882 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.564862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e1257ae3-f69b-4b5c-b3ff-2400607495ed-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.566682 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.566657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e1257ae3-f69b-4b5c-b3ff-2400607495ed-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.566800 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.566769 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15464d00-15bf-4f8f-a47f-ffeac13f32c5-cert\") pod 
\"ingress-canary-bh9wx\" (UID: \"15464d00-15bf-4f8f-a47f-ffeac13f32c5\") " pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.571935 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.571913 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:09.572269 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.572248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkv4n\" (UniqueName: \"kubernetes.io/projected/e1257ae3-f69b-4b5c-b3ff-2400607495ed-kube-api-access-gkv4n\") pod \"insights-runtime-extractor-9r9bf\" (UID: \"e1257ae3-f69b-4b5c-b3ff-2400607495ed\") " pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.572411 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.572387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrg5\" (UniqueName: \"kubernetes.io/projected/15464d00-15bf-4f8f-a47f-ffeac13f32c5-kube-api-access-5lrg5\") pod \"ingress-canary-bh9wx\" (UID: \"15464d00-15bf-4f8f-a47f-ffeac13f32c5\") " pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.595039 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.595007 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bh9wx" Apr 22 13:22:09.613526 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.612853 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9r9bf" Apr 22 13:22:09.740108 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.740078 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bh9wx"] Apr 22 13:22:09.746934 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.746911 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gh69k"] Apr 22 13:22:09.749264 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:09.749236 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15464d00_15bf_4f8f_a47f_ffeac13f32c5.slice/crio-1b45480bf2eddff2062f5b64905147edf39b803524f1a46f78e954b85cdd519e WatchSource:0}: Error finding container 1b45480bf2eddff2062f5b64905147edf39b803524f1a46f78e954b85cdd519e: Status 404 returned error can't find the container with id 1b45480bf2eddff2062f5b64905147edf39b803524f1a46f78e954b85cdd519e Apr 22 13:22:09.749790 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:09.749759 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod352d4472_2ff1_4835_b7bb_78277c591127.slice/crio-6c135af2ba52bacb3485e8b3f1131fe0e3d2f5b9666e3b166c42f739769531a4 WatchSource:0}: Error finding container 6c135af2ba52bacb3485e8b3f1131fe0e3d2f5b9666e3b166c42f739769531a4: Status 404 returned error can't find the container with id 6c135af2ba52bacb3485e8b3f1131fe0e3d2f5b9666e3b166c42f739769531a4 Apr 22 13:22:09.763633 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.763613 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9r9bf"] Apr 22 13:22:09.767422 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:09.767401 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1257ae3_f69b_4b5c_b3ff_2400607495ed.slice/crio-cde987378d9e66016904473c7a3bd274b6b4f3afc93416371fc6eb986f7affff WatchSource:0}: Error finding container cde987378d9e66016904473c7a3bd274b6b4f3afc93416371fc6eb986f7affff: Status 404 returned error can't find the container with id cde987378d9e66016904473c7a3bd274b6b4f3afc93416371fc6eb986f7affff Apr 22 13:22:09.821219 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.821196 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-md289"] Apr 22 13:22:09.833181 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.833145 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-md289"] Apr 22 13:22:09.833300 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.833285 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-md289" Apr 22 13:22:09.835599 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.835579 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 13:22:09.835717 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.835629 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 13:22:09.835717 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.835696 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 13:22:09.835817 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.835720 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 13:22:09.835884 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.835868 2567 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 13:22:09.835974 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.835959 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-vtdfw\""
Apr 22 13:22:09.966964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.966932 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a04764b3-e96d-4a35-8c3a-14a1c7c86599-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:09.966964 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.966973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg2c5\" (UniqueName: \"kubernetes.io/projected/a04764b3-e96d-4a35-8c3a-14a1c7c86599-kube-api-access-sg2c5\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:09.967397 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.967078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04764b3-e96d-4a35-8c3a-14a1c7c86599-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:09.967397 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:09.967111 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a04764b3-e96d-4a35-8c3a-14a1c7c86599-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.067816 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.067787 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a04764b3-e96d-4a35-8c3a-14a1c7c86599-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.067816 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.067822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg2c5\" (UniqueName: \"kubernetes.io/projected/a04764b3-e96d-4a35-8c3a-14a1c7c86599-kube-api-access-sg2c5\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.068011 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.067880 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04764b3-e96d-4a35-8c3a-14a1c7c86599-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.068011 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.067915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a04764b3-e96d-4a35-8c3a-14a1c7c86599-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.068625 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.068607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a04764b3-e96d-4a35-8c3a-14a1c7c86599-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.071135 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.071107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a04764b3-e96d-4a35-8c3a-14a1c7c86599-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.071263 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.071141 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04764b3-e96d-4a35-8c3a-14a1c7c86599-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.075986 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.075960 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg2c5\" (UniqueName: \"kubernetes.io/projected/a04764b3-e96d-4a35-8c3a-14a1c7c86599-kube-api-access-sg2c5\") pod \"prometheus-operator-5676c8c784-md289\" (UID: \"a04764b3-e96d-4a35-8c3a-14a1c7c86599\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.143595 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.143565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-md289"
Apr 22 13:22:10.169408 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.169348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9r9bf" event={"ID":"e1257ae3-f69b-4b5c-b3ff-2400607495ed","Type":"ContainerStarted","Data":"ef380b79060838c7732e0bd8c550342e17ba0a8c45a62860868cf823bded9d0b"}
Apr 22 13:22:10.169408 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.169397 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9r9bf" event={"ID":"e1257ae3-f69b-4b5c-b3ff-2400607495ed","Type":"ContainerStarted","Data":"cde987378d9e66016904473c7a3bd274b6b4f3afc93416371fc6eb986f7affff"}
Apr 22 13:22:10.170663 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.170633 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bh9wx" event={"ID":"15464d00-15bf-4f8f-a47f-ffeac13f32c5","Type":"ContainerStarted","Data":"1b45480bf2eddff2062f5b64905147edf39b803524f1a46f78e954b85cdd519e"}
Apr 22 13:22:10.171869 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.171843 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gh69k" event={"ID":"352d4472-2ff1-4835-b7bb-78277c591127","Type":"ContainerStarted","Data":"6c135af2ba52bacb3485e8b3f1131fe0e3d2f5b9666e3b166c42f739769531a4"}
Apr 22 13:22:10.304280 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.304033 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-md289"]
Apr 22 13:22:10.308226 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:10.308194 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04764b3_e96d_4a35_8c3a_14a1c7c86599.slice/crio-a63df360ac13027be9a384caaca987865269f4f1d33335b1e7b9f60e0b7b6822 WatchSource:0}: Error finding container a63df360ac13027be9a384caaca987865269f4f1d33335b1e7b9f60e0b7b6822: Status 404 returned error can't find the container with id a63df360ac13027be9a384caaca987865269f4f1d33335b1e7b9f60e0b7b6822
Apr 22 13:22:10.943248 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.943213 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:22:10.943436 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.943212 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:22:10.946656 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.946637 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bxmrr\""
Apr 22 13:22:10.946781 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.946670 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8cjvt\""
Apr 22 13:22:10.946781 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.946686 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 13:22:10.946781 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.946640 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 13:22:10.946781 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:10.946720 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 13:22:11.175010 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.174972 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-md289" event={"ID":"a04764b3-e96d-4a35-8c3a-14a1c7c86599","Type":"ContainerStarted","Data":"a63df360ac13027be9a384caaca987865269f4f1d33335b1e7b9f60e0b7b6822"}
Apr 22 13:22:11.393515 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.393482 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9ff89f9c9-wr7xw"]
Apr 22 13:22:11.415976 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.415898 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9ff89f9c9-wr7xw"]
Apr 22 13:22:11.416132 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.416037 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.418463 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.418439 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 13:22:11.419276 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419192 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 13:22:11.419276 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419203 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rgk2k\""
Apr 22 13:22:11.419430 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419291 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 13:22:11.419430 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419302 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 13:22:11.419430 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419346 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 13:22:11.419632 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419616 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 13:22:11.419775 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.419760 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 13:22:11.577214 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.577177 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-serving-cert\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.577399 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.577230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-oauth-config\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.577399 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.577260 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-service-ca\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.577399 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.577358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-oauth-serving-cert\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.577533 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.577459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-console-config\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.577533 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.577482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqnjb\" (UniqueName: \"kubernetes.io/projected/db9aae44-f696-4033-bd8c-dde682a39a7f-kube-api-access-jqnjb\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678200 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-oauth-serving-cert\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678200 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678184 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-console-config\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678200 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqnjb\" (UniqueName: \"kubernetes.io/projected/db9aae44-f696-4033-bd8c-dde682a39a7f-kube-api-access-jqnjb\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678487 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-serving-cert\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678487 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-oauth-config\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678487 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-service-ca\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678961 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678908 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-oauth-serving-cert\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.678961 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.678934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-console-config\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.681451 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.681428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-serving-cert\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.681451 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.681440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-oauth-config\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.686397 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.686262 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-service-ca\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.686926 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.686879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqnjb\" (UniqueName: \"kubernetes.io/projected/db9aae44-f696-4033-bd8c-dde682a39a7f-kube-api-access-jqnjb\") pod \"console-9ff89f9c9-wr7xw\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") " pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:11.727613 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:11.727584 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:22:12.889494 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:12.889054 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9ff89f9c9-wr7xw"]
Apr 22 13:22:12.889494 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:12.889382 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb9aae44_f696_4033_bd8c_dde682a39a7f.slice/crio-2e5b3ca4cf7f1aa59725fb0339b937fa1ddecd87553e27e213dbadb2ca0890fc WatchSource:0}: Error finding container 2e5b3ca4cf7f1aa59725fb0339b937fa1ddecd87553e27e213dbadb2ca0890fc: Status 404 returned error can't find the container with id 2e5b3ca4cf7f1aa59725fb0339b937fa1ddecd87553e27e213dbadb2ca0890fc
Apr 22 13:22:13.181447 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:13.181409 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ff89f9c9-wr7xw" event={"ID":"db9aae44-f696-4033-bd8c-dde682a39a7f","Type":"ContainerStarted","Data":"2e5b3ca4cf7f1aa59725fb0339b937fa1ddecd87553e27e213dbadb2ca0890fc"}
Apr 22 13:22:13.183073 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:13.183046 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9r9bf" event={"ID":"e1257ae3-f69b-4b5c-b3ff-2400607495ed","Type":"ContainerStarted","Data":"409d0eea1e710bc00e42d3772b0d9ca5386ec6406328bfdb481a7de962ea3ab5"}
Apr 22 13:22:13.184302 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:13.184266 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-md289" event={"ID":"a04764b3-e96d-4a35-8c3a-14a1c7c86599","Type":"ContainerStarted","Data":"ef6006bf87dbf75e5434b2bdcd09cc1cb88d1955cb1c7928ccf0e68fc08a99e2"}
Apr 22 13:22:13.185503 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:13.185475 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bh9wx" event={"ID":"15464d00-15bf-4f8f-a47f-ffeac13f32c5","Type":"ContainerStarted","Data":"b1e82a247f5bcad9c4eac6fdb5f7a11ff5fc169d01cfc43c2198cc84099306b9"}
Apr 22 13:22:13.186725 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:13.186702 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gh69k" event={"ID":"352d4472-2ff1-4835-b7bb-78277c591127","Type":"ContainerStarted","Data":"1c7ac6135ea4dd217964e14a368d29f0bbebfb667de55480bc38388c72f5b86f"}
Apr 22 13:22:13.201120 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:13.201072 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bh9wx" podStartSLOduration=1.200451372 podStartE2EDuration="4.20105578s" podCreationTimestamp="2026-04-22 13:22:09 +0000 UTC" firstStartedPulling="2026-04-22 13:22:09.751254593 +0000 UTC m=+46.404358355" lastFinishedPulling="2026-04-22 13:22:12.751859004 +0000 UTC m=+49.404962763" observedRunningTime="2026-04-22 13:22:13.200770061 +0000 UTC m=+49.853873843" watchObservedRunningTime="2026-04-22 13:22:13.20105578 +0000 UTC m=+49.854159563"
Apr 22 13:22:14.194488 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:14.194446 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-md289" event={"ID":"a04764b3-e96d-4a35-8c3a-14a1c7c86599","Type":"ContainerStarted","Data":"406db9c4eaf65205e5fc64659ac0bffe374acb4775ace7e33837955c9d1d3d15"}
Apr 22 13:22:14.196426 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:14.196384 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gh69k" event={"ID":"352d4472-2ff1-4835-b7bb-78277c591127","Type":"ContainerStarted","Data":"79c035e8c70ee546e4c4d9ed80f9e099b569a9ba859e547380f9f2d4d3001b7e"}
Apr 22 13:22:14.196749 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:14.196727 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gh69k"
Apr 22 13:22:14.211778 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:14.211737 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-md289" podStartSLOduration=2.485790379 podStartE2EDuration="5.211723319s" podCreationTimestamp="2026-04-22 13:22:09 +0000 UTC" firstStartedPulling="2026-04-22 13:22:10.31066715 +0000 UTC m=+46.963770921" lastFinishedPulling="2026-04-22 13:22:13.036600099 +0000 UTC m=+49.689703861" observedRunningTime="2026-04-22 13:22:14.210595225 +0000 UTC m=+50.863699009" watchObservedRunningTime="2026-04-22 13:22:14.211723319 +0000 UTC m=+50.864827118"
Apr 22 13:22:14.226732 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:14.226683 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gh69k" podStartSLOduration=2.227852769 podStartE2EDuration="5.226671457s" podCreationTimestamp="2026-04-22 13:22:09 +0000 UTC" firstStartedPulling="2026-04-22 13:22:09.751513086 +0000 UTC m=+46.404616845" lastFinishedPulling="2026-04-22 13:22:12.750331769 +0000 UTC m=+49.403435533" observedRunningTime="2026-04-22 13:22:14.226452101 +0000 UTC m=+50.879555883" watchObservedRunningTime="2026-04-22 13:22:14.226671457 +0000 UTC m=+50.879775239"
Apr 22 13:22:15.201960 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:15.201917 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9r9bf" event={"ID":"e1257ae3-f69b-4b5c-b3ff-2400607495ed","Type":"ContainerStarted","Data":"6cf323350fb84c852071f3f04320e8bca5900175f0853407fc057f17531977c6"}
Apr 22 13:22:15.220931 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:15.220873 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9r9bf" podStartSLOduration=1.491412434 podStartE2EDuration="6.220855123s" podCreationTimestamp="2026-04-22 13:22:09 +0000 UTC" firstStartedPulling="2026-04-22 13:22:09.863852536 +0000 UTC m=+46.516956301" lastFinishedPulling="2026-04-22 13:22:14.593295224 +0000 UTC m=+51.246398990" observedRunningTime="2026-04-22 13:22:15.219412388 +0000 UTC m=+51.872516187" watchObservedRunningTime="2026-04-22 13:22:15.220855123 +0000 UTC m=+51.873958905"
Apr 22 13:22:16.198974 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.198790 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jm87x"]
Apr 22 13:22:16.250128 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.250096 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srp4x"]
Apr 22 13:22:16.250528 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.250175 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.252902 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.252880 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 13:22:16.253350 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.253320 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x2x6b\""
Apr 22 13:22:16.253551 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.253508 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 13:22:16.253713 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.253696 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 13:22:16.259295 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.259276 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srp4x"]
Apr 22 13:22:16.259412 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.259399 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.263530 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.263514 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 13:22:16.264283 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.264267 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 13:22:16.265347 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.265239 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-9tgbc\""
Apr 22 13:22:16.265982 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.265968 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 13:22:16.311737 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdcb\" (UniqueName: \"kubernetes.io/projected/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-kube-api-access-qtdcb\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.311737 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311736 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311773 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/61c05769-3e86-4c56-836b-6696f02722af-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-metrics-client-ca\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311847 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-textfile\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311911 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-sys\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312015 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.311995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-root\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312392 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.312031 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61c05769-3e86-4c56-836b-6696f02722af-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.312392 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.312066 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.312392 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.312115 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6t74\" (UniqueName: \"kubernetes.io/projected/61c05769-3e86-4c56-836b-6696f02722af-kube-api-access-x6t74\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.312392 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.312194 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-tls\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312392 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.312219 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-wtmp\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.312392 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.312256 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-accelerators-collector-config\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.412820 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.412788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-textfile\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.412820 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.412823 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.412858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-sys\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.412916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-root\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.412926 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-sys\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.412952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61c05769-3e86-4c56-836b-6696f02722af-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6t74\" (UniqueName: \"kubernetes.io/projected/61c05769-3e86-4c56-836b-6696f02722af-kube-api-access-x6t74\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x"
Apr 22 13:22:16.413054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-root\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-tls\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x"
Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413117 2567
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-wtmp\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:16.413142 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-accelerators-collector-config\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:16.413222 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-tls podName:61c05769-3e86-4c56-836b-6696f02722af nodeName:}" failed. No retries permitted until 2026-04-22 13:22:16.91320258 +0000 UTC m=+53.566306346 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-srp4x" (UID: "61c05769-3e86-4c56-836b-6696f02722af") : secret "kube-state-metrics-tls" not found Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:16.413228 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdcb\" (UniqueName: \"kubernetes.io/projected/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-kube-api-access-qtdcb\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:16.413285 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-tls podName:e6cb5329-b117-42b2-8a20-2d8bbd3ccc40 nodeName:}" failed. No retries permitted until 2026-04-22 13:22:16.913267124 +0000 UTC m=+53.566370885 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-tls") pod "node-exporter-jm87x" (UID: "e6cb5329-b117-42b2-8a20-2d8bbd3ccc40") : secret "node-exporter-tls" not found Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.413386 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413356 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.413904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/61c05769-3e86-4c56-836b-6696f02722af-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.413904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-wtmp\") pod 
\"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.413904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413142 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-textfile\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.413904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-metrics-client-ca\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.413904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61c05769-3e86-4c56-836b-6696f02722af-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.413904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.413796 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/61c05769-3e86-4c56-836b-6696f02722af-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.416974 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.416951 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.416974 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.416967 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.420086 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.420063 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-metrics-client-ca\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.420212 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.420129 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-accelerators-collector-config\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.425144 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.425114 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtdcb\" (UniqueName: \"kubernetes.io/projected/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-kube-api-access-qtdcb\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " 
pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.425996 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.425973 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6t74\" (UniqueName: \"kubernetes.io/projected/61c05769-3e86-4c56-836b-6696f02722af-kube-api-access-x6t74\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.426748 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.426705 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.917077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.917041 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:16.917305 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.917088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-tls\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.919619 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.919596 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e6cb5329-b117-42b2-8a20-2d8bbd3ccc40-node-exporter-tls\") pod \"node-exporter-jm87x\" (UID: \"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40\") " pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:16.919724 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:16.919630 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61c05769-3e86-4c56-836b-6696f02722af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srp4x\" (UID: \"61c05769-3e86-4c56-836b-6696f02722af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:17.159753 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:17.159725 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jm87x" Apr 22 13:22:17.167557 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:17.167509 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" Apr 22 13:22:17.167827 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:17.167797 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cb5329_b117_42b2_8a20_2d8bbd3ccc40.slice/crio-ff5618a0d28ac9de2593853ebbad06b83cebe450299173aa7a2721dcf8c0abcb WatchSource:0}: Error finding container ff5618a0d28ac9de2593853ebbad06b83cebe450299173aa7a2721dcf8c0abcb: Status 404 returned error can't find the container with id ff5618a0d28ac9de2593853ebbad06b83cebe450299173aa7a2721dcf8c0abcb Apr 22 13:22:17.209589 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:17.209542 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jm87x" event={"ID":"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40","Type":"ContainerStarted","Data":"ff5618a0d28ac9de2593853ebbad06b83cebe450299173aa7a2721dcf8c0abcb"} Apr 22 13:22:17.210992 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:17.210960 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ff89f9c9-wr7xw" event={"ID":"db9aae44-f696-4033-bd8c-dde682a39a7f","Type":"ContainerStarted","Data":"eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845"} Apr 22 13:22:17.231973 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:17.231757 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9ff89f9c9-wr7xw" podStartSLOduration=2.7848011 podStartE2EDuration="6.231739974s" podCreationTimestamp="2026-04-22 13:22:11 +0000 UTC" firstStartedPulling="2026-04-22 13:22:13.033346924 +0000 UTC m=+49.686450686" lastFinishedPulling="2026-04-22 13:22:16.480285801 +0000 UTC m=+53.133389560" observedRunningTime="2026-04-22 13:22:17.231401487 +0000 UTC m=+53.884505269" watchObservedRunningTime="2026-04-22 13:22:17.231739974 +0000 UTC m=+53.884843757" Apr 22 13:22:17.292933 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:22:17.292896 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srp4x"] Apr 22 13:22:17.297079 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:17.297047 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61c05769_3e86_4c56_836b_6696f02722af.slice/crio-dc351a013ec83b29f9b8a794135861c9b3ab4089eb11115dcbbe2f0062ed3db9 WatchSource:0}: Error finding container dc351a013ec83b29f9b8a794135861c9b3ab4089eb11115dcbbe2f0062ed3db9: Status 404 returned error can't find the container with id dc351a013ec83b29f9b8a794135861c9b3ab4089eb11115dcbbe2f0062ed3db9 Apr 22 13:22:18.215011 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:18.214972 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" event={"ID":"61c05769-3e86-4c56-836b-6696f02722af","Type":"ContainerStarted","Data":"dc351a013ec83b29f9b8a794135861c9b3ab4089eb11115dcbbe2f0062ed3db9"} Apr 22 13:22:19.219602 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.219575 2567 generic.go:358] "Generic (PLEG): container finished" podID="e6cb5329-b117-42b2-8a20-2d8bbd3ccc40" containerID="2544d4a25e117e097b263a230011cd95dc11d330df884a204eefb9f627577dda" exitCode=0 Apr 22 13:22:19.219977 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.219663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jm87x" event={"ID":"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40","Type":"ContainerDied","Data":"2544d4a25e117e097b263a230011cd95dc11d330df884a204eefb9f627577dda"} Apr 22 13:22:19.221237 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.221207 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" 
event={"ID":"61c05769-3e86-4c56-836b-6696f02722af","Type":"ContainerStarted","Data":"e564748a81f18ca02626ff08337386914db64c0fcb2a6b9e07454baef29a4ce3"} Apr 22 13:22:19.297829 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.297795 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-86696546f4-v6h8j"] Apr 22 13:22:19.317456 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.317390 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-86696546f4-v6h8j"] Apr 22 13:22:19.317569 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.317534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.319987 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.320025 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.320307 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.320379 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.320405 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jbqbh\"" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.320438 2567 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-43bkp9c6giigf\"" Apr 22 13:22:19.320566 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.320485 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 13:22:19.341547 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-grpc-tls\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341711 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341553 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341711 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341711 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341632 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341711 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341692 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpsdl\" (UniqueName: \"kubernetes.io/projected/8e7f151a-ce1d-4348-8440-269a9010bb2b-kube-api-access-hpsdl\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341962 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341719 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e7f151a-ce1d-4348-8440-269a9010bb2b-metrics-client-ca\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341962 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-tls\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.341962 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.341810 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442302 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442274 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpsdl\" (UniqueName: \"kubernetes.io/projected/8e7f151a-ce1d-4348-8440-269a9010bb2b-kube-api-access-hpsdl\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442411 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442312 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e7f151a-ce1d-4348-8440-269a9010bb2b-metrics-client-ca\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442411 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442348 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-tls\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442411 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442411 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:22:19.442406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-grpc-tls\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442649 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442649 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.442649 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.442514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.443534 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.443483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8e7f151a-ce1d-4348-8440-269a9010bb2b-metrics-client-ca\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.446220 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.446190 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.446618 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.446580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.446759 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.446736 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-grpc-tls\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.446848 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.446832 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " 
pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.446905 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.446869 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.446981 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.446964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e7f151a-ce1d-4348-8440-269a9010bb2b-secret-thanos-querier-tls\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.449971 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.449949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpsdl\" (UniqueName: \"kubernetes.io/projected/8e7f151a-ce1d-4348-8440-269a9010bb2b-kube-api-access-hpsdl\") pod \"thanos-querier-86696546f4-v6h8j\" (UID: \"8e7f151a-ce1d-4348-8440-269a9010bb2b\") " pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.630382 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.630304 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:19.748624 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:19.748596 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-86696546f4-v6h8j"] Apr 22 13:22:19.752500 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:19.752473 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7f151a_ce1d_4348_8440_269a9010bb2b.slice/crio-a057a6793a932826b1e2af2cfdd4f131f0d403bc982590b191b1daf9c0c557c9 WatchSource:0}: Error finding container a057a6793a932826b1e2af2cfdd4f131f0d403bc982590b191b1daf9c0c557c9: Status 404 returned error can't find the container with id a057a6793a932826b1e2af2cfdd4f131f0d403bc982590b191b1daf9c0c557c9 Apr 22 13:22:20.226154 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.226118 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jm87x" event={"ID":"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40","Type":"ContainerStarted","Data":"ca805b0e64179b6503ad53413cb3c9c651ce7bde16e0a6df9eb87e1cfbfacc31"} Apr 22 13:22:20.226589 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.226160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jm87x" event={"ID":"e6cb5329-b117-42b2-8a20-2d8bbd3ccc40","Type":"ContainerStarted","Data":"f151f3d0fcd16320234aeca4008c45a607eb1291e66d443415e7cc462f2b365b"} Apr 22 13:22:20.228060 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.227991 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" event={"ID":"61c05769-3e86-4c56-836b-6696f02722af","Type":"ContainerStarted","Data":"0ee1f0b53b429000fca6aa7dcad7aeba0e76e61a0a45adf795d5f7037100c1df"} Apr 22 13:22:20.228199 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.228067 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" event={"ID":"61c05769-3e86-4c56-836b-6696f02722af","Type":"ContainerStarted","Data":"c844692f94a5aee8379f7ebf8548a3b5a184ac877199fb712c0acab264726a88"} Apr 22 13:22:20.228947 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.228927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"a057a6793a932826b1e2af2cfdd4f131f0d403bc982590b191b1daf9c0c557c9"} Apr 22 13:22:20.246491 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.246411 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jm87x" podStartSLOduration=3.132635405 podStartE2EDuration="4.246399457s" podCreationTimestamp="2026-04-22 13:22:16 +0000 UTC" firstStartedPulling="2026-04-22 13:22:17.171213871 +0000 UTC m=+53.824317629" lastFinishedPulling="2026-04-22 13:22:18.284977913 +0000 UTC m=+54.938081681" observedRunningTime="2026-04-22 13:22:20.244930232 +0000 UTC m=+56.898034013" watchObservedRunningTime="2026-04-22 13:22:20.246399457 +0000 UTC m=+56.899503238" Apr 22 13:22:20.264000 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:20.263960 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-srp4x" podStartSLOduration=2.466867097 podStartE2EDuration="4.263948475s" podCreationTimestamp="2026-04-22 13:22:16 +0000 UTC" firstStartedPulling="2026-04-22 13:22:17.298995451 +0000 UTC m=+53.952099213" lastFinishedPulling="2026-04-22 13:22:19.096076828 +0000 UTC m=+55.749180591" observedRunningTime="2026-04-22 13:22:20.26298447 +0000 UTC m=+56.916088251" watchObservedRunningTime="2026-04-22 13:22:20.263948475 +0000 UTC m=+56.917052292" Apr 22 13:22:21.004214 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.004184 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx"] Apr 22 13:22:21.007667 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.007648 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:21.009838 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.009820 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 13:22:21.009923 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.009838 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cfxn4\"" Apr 22 13:22:21.017508 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.017487 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx"] Apr 22 13:22:21.057827 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.057788 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d883f20c-5cff-43c8-ac4c-ca28964b1e8e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmnfx\" (UID: \"d883f20c-5cff-43c8-ac4c-ca28964b1e8e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:21.159127 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.159089 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d883f20c-5cff-43c8-ac4c-ca28964b1e8e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmnfx\" (UID: \"d883f20c-5cff-43c8-ac4c-ca28964b1e8e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:21.159339 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:21.159271 2567 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret 
"monitoring-plugin-cert" not found Apr 22 13:22:21.159406 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:22:21.159352 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d883f20c-5cff-43c8-ac4c-ca28964b1e8e-monitoring-plugin-cert podName:d883f20c-5cff-43c8-ac4c-ca28964b1e8e nodeName:}" failed. No retries permitted until 2026-04-22 13:22:21.659331068 +0000 UTC m=+58.312434828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d883f20c-5cff-43c8-ac4c-ca28964b1e8e-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-vmnfx" (UID: "d883f20c-5cff-43c8-ac4c-ca28964b1e8e") : secret "monitoring-plugin-cert" not found Apr 22 13:22:21.663350 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.663310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d883f20c-5cff-43c8-ac4c-ca28964b1e8e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmnfx\" (UID: \"d883f20c-5cff-43c8-ac4c-ca28964b1e8e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:21.666048 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.666018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d883f20c-5cff-43c8-ac4c-ca28964b1e8e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vmnfx\" (UID: \"d883f20c-5cff-43c8-ac4c-ca28964b1e8e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:21.727737 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.727699 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9ff89f9c9-wr7xw" Apr 22 13:22:21.727893 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.727755 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-9ff89f9c9-wr7xw" Apr 22 13:22:21.732923 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.732901 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9ff89f9c9-wr7xw" Apr 22 13:22:21.917232 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:21.917128 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:22.056050 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.056019 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx"] Apr 22 13:22:22.058853 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:22.058824 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd883f20c_5cff_43c8_ac4c_ca28964b1e8e.slice/crio-7813a6cc278836b060e12e55c7b7ee85d44d106f1dd625abfbc3ebba5e26d1f9 WatchSource:0}: Error finding container 7813a6cc278836b060e12e55c7b7ee85d44d106f1dd625abfbc3ebba5e26d1f9: Status 404 returned error can't find the container with id 7813a6cc278836b060e12e55c7b7ee85d44d106f1dd625abfbc3ebba5e26d1f9 Apr 22 13:22:22.134502 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.134476 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n7z2m" Apr 22 13:22:22.237522 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.237467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"09a0f821ee44ac514732799ea36a250318cacfb978e0e45723b3676bfef0d20f"} Apr 22 13:22:22.237522 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.237514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" 
event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"4c71e830e1b40424ec2fa062de5dbdfca8b571eeb3a8fb66b7c163f8f30f98c9"} Apr 22 13:22:22.237522 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.237527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"4ca1bd06449af6df93a8bdf39975cbe82b0a6b194aa0dff18638b6a9b4cf15f4"} Apr 22 13:22:22.238892 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.238858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" event={"ID":"d883f20c-5cff-43c8-ac4c-ca28964b1e8e","Type":"ContainerStarted","Data":"7813a6cc278836b060e12e55c7b7ee85d44d106f1dd625abfbc3ebba5e26d1f9"} Apr 22 13:22:22.247647 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:22.247615 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9ff89f9c9-wr7xw" Apr 22 13:22:24.204923 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.204895 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gh69k" Apr 22 13:22:24.246782 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.246746 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" event={"ID":"d883f20c-5cff-43c8-ac4c-ca28964b1e8e","Type":"ContainerStarted","Data":"06ccd2b64da8a48c4cf13f83b4d8d997558a8e383799145a40e8528c8ee08514"} Apr 22 13:22:24.246976 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.246934 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:24.249797 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.249775 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"67f4d6f57f997c706a24488d604c787cffe9f1ba7d8e5d1a045f857932a60372"} Apr 22 13:22:24.249797 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.249798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"9d50c98a723d5d92cdf885087ad4a54264c3ec71acef7b63c6d1a6050723a068"} Apr 22 13:22:24.249981 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.249810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" event={"ID":"8e7f151a-ce1d-4348-8440-269a9010bb2b","Type":"ContainerStarted","Data":"060e645d2dc3aa28882f3501c4b47b23a4038163fdcef1c97dd4a14ceb00ee4a"} Apr 22 13:22:24.250033 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.249996 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" Apr 22 13:22:24.252216 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.252198 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" Apr 22 13:22:24.281676 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.281637 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j" podStartSLOduration=1.459819753 podStartE2EDuration="5.281600625s" podCreationTimestamp="2026-04-22 13:22:19 +0000 UTC" firstStartedPulling="2026-04-22 13:22:19.754358541 +0000 UTC m=+56.407462300" lastFinishedPulling="2026-04-22 13:22:23.576139413 +0000 UTC m=+60.229243172" observedRunningTime="2026-04-22 13:22:24.28029222 +0000 UTC m=+60.933396002" watchObservedRunningTime="2026-04-22 13:22:24.281600625 +0000 UTC m=+60.934704406" Apr 
22 13:22:24.281789 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:24.281747 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vmnfx" podStartSLOduration=2.763802762 podStartE2EDuration="4.281741999s" podCreationTimestamp="2026-04-22 13:22:20 +0000 UTC" firstStartedPulling="2026-04-22 13:22:22.060859187 +0000 UTC m=+58.713962946" lastFinishedPulling="2026-04-22 13:22:23.578798411 +0000 UTC m=+60.231902183" observedRunningTime="2026-04-22 13:22:24.26113567 +0000 UTC m=+60.914239452" watchObservedRunningTime="2026-04-22 13:22:24.281741999 +0000 UTC m=+60.934845779" Apr 22 13:22:27.590379 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.590335 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-869c86d7c6-rcczx"] Apr 22 13:22:27.621107 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.621079 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-869c86d7c6-rcczx"] Apr 22 13:22:27.621285 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.621207 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.627862 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.627838 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 13:22:27.720609 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-oauth-config\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.720778 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720622 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-config\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.720778 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720681 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22p7p\" (UniqueName: \"kubernetes.io/projected/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-kube-api-access-22p7p\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.720778 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720730 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-serving-cert\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " 
pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.720882 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720797 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-oauth-serving-cert\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.720882 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-service-ca\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.720882 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.720858 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-trusted-ca-bundle\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.821871 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.821838 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-service-ca\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.822104 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.822086 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-trusted-ca-bundle\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.822271 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.822255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-oauth-config\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.822418 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.822405 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-config\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.822531 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.822517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22p7p\" (UniqueName: \"kubernetes.io/projected/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-kube-api-access-22p7p\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.822660 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.822646 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-serving-cert\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.822801 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.822789 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-oauth-serving-cert\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.823766 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.823738 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-oauth-serving-cert\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.824505 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.824482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-service-ca\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.825908 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.825883 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-trusted-ca-bundle\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.826029 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.826006 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-config\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.827795 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.827769 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-serving-cert\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.827873 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.827849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-oauth-config\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.833596 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.833575 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22p7p\" (UniqueName: \"kubernetes.io/projected/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-kube-api-access-22p7p\") pod \"console-869c86d7c6-rcczx\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") " pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:27.930025 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:27.929950 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:22:28.054754 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:28.054723 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-869c86d7c6-rcczx"] Apr 22 13:22:28.057505 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:28.057473 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e2bc8a_821b_4020_b034_bd0f6d378f4b.slice/crio-14b99070e5844f2912997031e4a3f6768671ecc1f17979d9fcae5bb29054b590 WatchSource:0}: Error finding container 14b99070e5844f2912997031e4a3f6768671ecc1f17979d9fcae5bb29054b590: Status 404 returned error can't find the container with id 14b99070e5844f2912997031e4a3f6768671ecc1f17979d9fcae5bb29054b590 Apr 22 13:22:28.263308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:28.263271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-869c86d7c6-rcczx" event={"ID":"d6e2bc8a-821b-4020-b034-bd0f6d378f4b","Type":"ContainerStarted","Data":"3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d"} Apr 22 13:22:28.263308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:28.263310 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-869c86d7c6-rcczx" event={"ID":"d6e2bc8a-821b-4020-b034-bd0f6d378f4b","Type":"ContainerStarted","Data":"14b99070e5844f2912997031e4a3f6768671ecc1f17979d9fcae5bb29054b590"} Apr 22 13:22:28.281241 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:28.281160 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-869c86d7c6-rcczx" podStartSLOduration=1.281144809 podStartE2EDuration="1.281144809s" podCreationTimestamp="2026-04-22 13:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:22:28.279752192 +0000 UTC 
m=+64.932855974" watchObservedRunningTime="2026-04-22 13:22:28.281144809 +0000 UTC m=+64.934248588" Apr 22 13:22:29.739310 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.739275 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:29.741819 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.741801 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 13:22:29.751640 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.751617 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8-metrics-certs\") pod \"network-metrics-daemon-kbkn6\" (UID: \"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8\") " pod="openshift-multus/network-metrics-daemon-kbkn6" Apr 22 13:22:29.840211 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.840156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk" Apr 22 13:22:29.842552 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.842535 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 13:22:29.853061 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.853043 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 13:22:29.854840 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.854827 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bxmrr\""
Apr 22 13:22:29.863424 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.863400 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbkn6"
Apr 22 13:22:29.863824 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.863801 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5blw\" (UniqueName: \"kubernetes.io/projected/2887fb17-5e94-487b-9353-de32227a0d91-kube-api-access-g5blw\") pod \"network-check-target-jklhk\" (UID: \"2887fb17-5e94-487b-9353-de32227a0d91\") " pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:22:29.978610 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:29.978579 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbkn6"]
Apr 22 13:22:29.981755 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:29.981724 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ecfd1aa_fe6d_4a7e_a343_9807fd0d3bc8.slice/crio-6c1700f2e9ecf16bfd8f81753cb5f102165bbe6da1151a965b5e8ad4e0230048 WatchSource:0}: Error finding container 6c1700f2e9ecf16bfd8f81753cb5f102165bbe6da1151a965b5e8ad4e0230048: Status 404 returned error can't find the container with id 6c1700f2e9ecf16bfd8f81753cb5f102165bbe6da1151a965b5e8ad4e0230048
Apr 22 13:22:30.160032 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:30.160004 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8cjvt\""
Apr 22 13:22:30.168246 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:30.168229 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:22:30.258441 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:30.258412 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-86696546f4-v6h8j"
Apr 22 13:22:30.270273 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:30.270244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbkn6" event={"ID":"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8","Type":"ContainerStarted","Data":"6c1700f2e9ecf16bfd8f81753cb5f102165bbe6da1151a965b5e8ad4e0230048"}
Apr 22 13:22:30.294560 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:30.294528 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jklhk"]
Apr 22 13:22:30.298713 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:22:30.298681 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2887fb17_5e94_487b_9353_de32227a0d91.slice/crio-7a2e73cf37945f5629669b655f215924264a1962798cacaeccad7ce050a0bbd3 WatchSource:0}: Error finding container 7a2e73cf37945f5629669b655f215924264a1962798cacaeccad7ce050a0bbd3: Status 404 returned error can't find the container with id 7a2e73cf37945f5629669b655f215924264a1962798cacaeccad7ce050a0bbd3
Apr 22 13:22:31.274874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:31.274835 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbkn6" event={"ID":"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8","Type":"ContainerStarted","Data":"1a6b1a3c1127323631f5521693291ad91527e7aef4aa610fcf44a83dcb67f07e"}
Apr 22 13:22:31.274874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:31.274877 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbkn6" event={"ID":"1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8","Type":"ContainerStarted","Data":"5ffd82badabf1c2db6ce92d1a66aa9dc105f55f0b5691946769c38285349f2a3"}
Apr 22 13:22:31.276053 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:31.276026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jklhk" event={"ID":"2887fb17-5e94-487b-9353-de32227a0d91","Type":"ContainerStarted","Data":"7a2e73cf37945f5629669b655f215924264a1962798cacaeccad7ce050a0bbd3"}
Apr 22 13:22:31.294764 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:31.294137 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kbkn6" podStartSLOduration=66.358758312 podStartE2EDuration="1m7.294120987s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:22:29.98367741 +0000 UTC m=+66.636781170" lastFinishedPulling="2026-04-22 13:22:30.919040076 +0000 UTC m=+67.572143845" observedRunningTime="2026-04-22 13:22:31.292594486 +0000 UTC m=+67.945698268" watchObservedRunningTime="2026-04-22 13:22:31.294120987 +0000 UTC m=+67.947224772"
Apr 22 13:22:34.287842 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:34.287802 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jklhk" event={"ID":"2887fb17-5e94-487b-9353-de32227a0d91","Type":"ContainerStarted","Data":"2b399cb80615412070e8ebafecf16aeba9ae064bf49b617b751dd78e845463e2"}
Apr 22 13:22:34.288261 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:34.288012 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:22:34.310978 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:34.310928 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jklhk" podStartSLOduration=67.145262693 podStartE2EDuration="1m10.310913229s" podCreationTimestamp="2026-04-22 13:21:24 +0000 UTC" firstStartedPulling="2026-04-22 13:22:30.300670499 +0000 UTC m=+66.953774258" lastFinishedPulling="2026-04-22 13:22:33.466321031 +0000 UTC m=+70.119424794" observedRunningTime="2026-04-22 13:22:34.309117536 +0000 UTC m=+70.962221328" watchObservedRunningTime="2026-04-22 13:22:34.310913229 +0000 UTC m=+70.964017013"
Apr 22 13:22:37.930478 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:37.930356 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-869c86d7c6-rcczx"
Apr 22 13:22:37.930478 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:37.930410 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-869c86d7c6-rcczx"
Apr 22 13:22:37.935198 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:37.935153 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-869c86d7c6-rcczx"
Apr 22 13:22:38.303155 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:38.303121 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-869c86d7c6-rcczx"
Apr 22 13:22:38.353615 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:22:38.353581 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9ff89f9c9-wr7xw"]
Apr 22 13:23:03.372110 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.372042 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9ff89f9c9-wr7xw" podUID="db9aae44-f696-4033-bd8c-dde682a39a7f" containerName="console" containerID="cri-o://eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845" gracePeriod=15
Apr 22 13:23:03.602785 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.602764 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9ff89f9c9-wr7xw_db9aae44-f696-4033-bd8c-dde682a39a7f/console/0.log"
Apr 22 13:23:03.602900 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.602831 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:23:03.607802 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.607784 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqnjb\" (UniqueName: \"kubernetes.io/projected/db9aae44-f696-4033-bd8c-dde682a39a7f-kube-api-access-jqnjb\") pod \"db9aae44-f696-4033-bd8c-dde682a39a7f\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") "
Apr 22 13:23:03.607871 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.607837 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-service-ca\") pod \"db9aae44-f696-4033-bd8c-dde682a39a7f\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") "
Apr 22 13:23:03.607871 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.607869 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-console-config\") pod \"db9aae44-f696-4033-bd8c-dde682a39a7f\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") "
Apr 22 13:23:03.607972 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.607895 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-oauth-config\") pod \"db9aae44-f696-4033-bd8c-dde682a39a7f\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") "
Apr 22 13:23:03.607972 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.607922 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-oauth-serving-cert\") pod \"db9aae44-f696-4033-bd8c-dde682a39a7f\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") "
Apr 22 13:23:03.607972 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.607950 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-serving-cert\") pod \"db9aae44-f696-4033-bd8c-dde682a39a7f\" (UID: \"db9aae44-f696-4033-bd8c-dde682a39a7f\") "
Apr 22 13:23:03.608409 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.608370 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-console-config" (OuterVolumeSpecName: "console-config") pod "db9aae44-f696-4033-bd8c-dde682a39a7f" (UID: "db9aae44-f696-4033-bd8c-dde682a39a7f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:23:03.608409 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.608388 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-service-ca" (OuterVolumeSpecName: "service-ca") pod "db9aae44-f696-4033-bd8c-dde682a39a7f" (UID: "db9aae44-f696-4033-bd8c-dde682a39a7f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:23:03.608565 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.608422 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db9aae44-f696-4033-bd8c-dde682a39a7f" (UID: "db9aae44-f696-4033-bd8c-dde682a39a7f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:23:03.610029 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.609990 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db9aae44-f696-4033-bd8c-dde682a39a7f" (UID: "db9aae44-f696-4033-bd8c-dde682a39a7f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 13:23:03.610029 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.610015 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db9aae44-f696-4033-bd8c-dde682a39a7f-kube-api-access-jqnjb" (OuterVolumeSpecName: "kube-api-access-jqnjb") pod "db9aae44-f696-4033-bd8c-dde682a39a7f" (UID: "db9aae44-f696-4033-bd8c-dde682a39a7f"). InnerVolumeSpecName "kube-api-access-jqnjb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 13:23:03.610143 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.610006 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db9aae44-f696-4033-bd8c-dde682a39a7f" (UID: "db9aae44-f696-4033-bd8c-dde682a39a7f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 13:23:03.709092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.709022 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-service-ca\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:23:03.709092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.709049 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-console-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:23:03.709092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.709058 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-oauth-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:23:03.709092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.709067 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db9aae44-f696-4033-bd8c-dde682a39a7f-oauth-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:23:03.709092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.709077 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db9aae44-f696-4033-bd8c-dde682a39a7f-console-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:23:03.709092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:03.709086 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqnjb\" (UniqueName: \"kubernetes.io/projected/db9aae44-f696-4033-bd8c-dde682a39a7f-kube-api-access-jqnjb\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:23:04.365314 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.365286 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9ff89f9c9-wr7xw_db9aae44-f696-4033-bd8c-dde682a39a7f/console/0.log"
Apr 22 13:23:04.365489 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.365325 2567 generic.go:358] "Generic (PLEG): container finished" podID="db9aae44-f696-4033-bd8c-dde682a39a7f" containerID="eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845" exitCode=2
Apr 22 13:23:04.365489 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.365361 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ff89f9c9-wr7xw" event={"ID":"db9aae44-f696-4033-bd8c-dde682a39a7f","Type":"ContainerDied","Data":"eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845"}
Apr 22 13:23:04.365489 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.365383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9ff89f9c9-wr7xw" event={"ID":"db9aae44-f696-4033-bd8c-dde682a39a7f","Type":"ContainerDied","Data":"2e5b3ca4cf7f1aa59725fb0339b937fa1ddecd87553e27e213dbadb2ca0890fc"}
Apr 22 13:23:04.365489 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.365388 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9ff89f9c9-wr7xw"
Apr 22 13:23:04.365489 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.365397 2567 scope.go:117] "RemoveContainer" containerID="eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845"
Apr 22 13:23:04.372750 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.372735 2567 scope.go:117] "RemoveContainer" containerID="eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845"
Apr 22 13:23:04.373013 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:23:04.372981 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845\": container with ID starting with eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845 not found: ID does not exist" containerID="eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845"
Apr 22 13:23:04.373050 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.373012 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845"} err="failed to get container status \"eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845\": rpc error: code = NotFound desc = could not find container \"eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845\": container with ID starting with eb91d020b99b5a809fe3814bfe017423f0e961624b16fcae11b21c129e28e845 not found: ID does not exist"
Apr 22 13:23:04.380990 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.380967 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9ff89f9c9-wr7xw"]
Apr 22 13:23:04.384886 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:04.384860 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9ff89f9c9-wr7xw"]
Apr 22 13:23:05.293878 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:05.293845 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jklhk"
Apr 22 13:23:05.947491 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:05.947456 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db9aae44-f696-4033-bd8c-dde682a39a7f" path="/var/lib/kubelet/pods/db9aae44-f696-4033-bd8c-dde682a39a7f/volumes"
Apr 22 13:23:44.976987 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:44.976949 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-977545878-lwm7h"]
Apr 22 13:23:44.977479 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:44.977217 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db9aae44-f696-4033-bd8c-dde682a39a7f" containerName="console"
Apr 22 13:23:44.977479 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:44.977229 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9aae44-f696-4033-bd8c-dde682a39a7f" containerName="console"
Apr 22 13:23:44.977479 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:44.977278 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="db9aae44-f696-4033-bd8c-dde682a39a7f" containerName="console"
Apr 22 13:23:44.980037 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:44.980017 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:44.994641 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:44.994617 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-977545878-lwm7h"]
Apr 22 13:23:45.128183 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-config\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.128361 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-trusted-ca-bundle\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.128361 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-oauth-serving-cert\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.128361 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-serving-cert\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.128361 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-service-ca\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.128498 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128367 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/96e13b96-a4cd-41f2-bbad-824aba6f55ef-kube-api-access-lqzfc\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.128498 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.128407 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-oauth-config\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229312 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-serving-cert\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229312 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229270 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-service-ca\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229312 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/96e13b96-a4cd-41f2-bbad-824aba6f55ef-kube-api-access-lqzfc\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229582 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-oauth-config\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229582 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229360 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-config\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229582 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-trusted-ca-bundle\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.229582 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.229401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-oauth-serving-cert\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.230124 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.230097 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-service-ca\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.230277 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.230126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-oauth-serving-cert\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.230277 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.230140 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-config\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.230541 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.230520 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-trusted-ca-bundle\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.232239 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.232220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-oauth-config\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.232441 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.232420 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-serving-cert\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.237211 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.237191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/96e13b96-a4cd-41f2-bbad-824aba6f55ef-kube-api-access-lqzfc\") pod \"console-977545878-lwm7h\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.288723 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.288690 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:45.403383 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.403353 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-977545878-lwm7h"]
Apr 22 13:23:45.406056 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:23:45.406028 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e13b96_a4cd_41f2_bbad_824aba6f55ef.slice/crio-955f7d5ef474223aadbc479bdbe87d70d229c8b84a0978f8cd65b7368fca925f WatchSource:0}: Error finding container 955f7d5ef474223aadbc479bdbe87d70d229c8b84a0978f8cd65b7368fca925f: Status 404 returned error can't find the container with id 955f7d5ef474223aadbc479bdbe87d70d229c8b84a0978f8cd65b7368fca925f
Apr 22 13:23:45.475242 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.475205 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-977545878-lwm7h" event={"ID":"96e13b96-a4cd-41f2-bbad-824aba6f55ef","Type":"ContainerStarted","Data":"4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c"}
Apr 22 13:23:45.475242 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.475248 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-977545878-lwm7h" event={"ID":"96e13b96-a4cd-41f2-bbad-824aba6f55ef","Type":"ContainerStarted","Data":"955f7d5ef474223aadbc479bdbe87d70d229c8b84a0978f8cd65b7368fca925f"}
Apr 22 13:23:45.493800 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:45.493679 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-977545878-lwm7h" podStartSLOduration=1.493662944 podStartE2EDuration="1.493662944s" podCreationTimestamp="2026-04-22 13:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:23:45.493370569 +0000 UTC m=+142.146474350" watchObservedRunningTime="2026-04-22 13:23:45.493662944 +0000 UTC m=+142.146766723"
Apr 22 13:23:55.289056 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:55.289025 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:55.289534 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:55.289186 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:55.294617 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:55.294589 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:55.508113 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:55.508085 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:23:55.563927 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:23:55.562334 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-869c86d7c6-rcczx"]
Apr 22 13:24:20.583529 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.583474 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-869c86d7c6-rcczx" podUID="d6e2bc8a-821b-4020-b034-bd0f6d378f4b" containerName="console" containerID="cri-o://3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d" gracePeriod=15
Apr 22 13:24:20.810288 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.810266 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-869c86d7c6-rcczx_d6e2bc8a-821b-4020-b034-bd0f6d378f4b/console/0.log"
Apr 22 13:24:20.810421 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.810326 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-869c86d7c6-rcczx"
Apr 22 13:24:20.899423 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899347 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-service-ca\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899423 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899393 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-oauth-config\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899423 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899421 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-serving-cert\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899684 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899470 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-config\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899684 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899487 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-oauth-serving-cert\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899684 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899518 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-trusted-ca-bundle\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899684 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899553 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22p7p\" (UniqueName: \"kubernetes.io/projected/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-kube-api-access-22p7p\") pod \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\" (UID: \"d6e2bc8a-821b-4020-b034-bd0f6d378f4b\") "
Apr 22 13:24:20.899914 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899883 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:24:20.899974 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899914 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-config" (OuterVolumeSpecName: "console-config") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:24:20.899974 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899940 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:24:20.900080 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.899986 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 13:24:20.901652 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.901630 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-kube-api-access-22p7p" (OuterVolumeSpecName: "kube-api-access-22p7p") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "kube-api-access-22p7p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 13:24:20.901652 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.901633 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 13:24:20.901652 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:20.901645 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6e2bc8a-821b-4020-b034-bd0f6d378f4b" (UID: "d6e2bc8a-821b-4020-b034-bd0f6d378f4b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 13:24:21.000497 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000465 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:24:21.000497 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000492 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-oauth-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:24:21.000497 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000503 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-trusted-ca-bundle\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:24:21.000714 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000512 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22p7p\" (UniqueName: \"kubernetes.io/projected/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-kube-api-access-22p7p\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:24:21.000714 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000522 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName:
\"kubernetes.io/configmap/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-service-ca\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:24:21.000714 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000532 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-oauth-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:24:21.000714 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.000541 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e2bc8a-821b-4020-b034-bd0f6d378f4b-console-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:24:21.575127 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.575102 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-869c86d7c6-rcczx_d6e2bc8a-821b-4020-b034-bd0f6d378f4b/console/0.log" Apr 22 13:24:21.575315 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.575142 2567 generic.go:358] "Generic (PLEG): container finished" podID="d6e2bc8a-821b-4020-b034-bd0f6d378f4b" containerID="3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d" exitCode=2 Apr 22 13:24:21.575315 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.575182 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-869c86d7c6-rcczx" event={"ID":"d6e2bc8a-821b-4020-b034-bd0f6d378f4b","Type":"ContainerDied","Data":"3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d"} Apr 22 13:24:21.575315 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.575222 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-869c86d7c6-rcczx" Apr 22 13:24:21.575315 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.575234 2567 scope.go:117] "RemoveContainer" containerID="3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d" Apr 22 13:24:21.575521 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.575224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-869c86d7c6-rcczx" event={"ID":"d6e2bc8a-821b-4020-b034-bd0f6d378f4b","Type":"ContainerDied","Data":"14b99070e5844f2912997031e4a3f6768671ecc1f17979d9fcae5bb29054b590"} Apr 22 13:24:21.583661 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.583642 2567 scope.go:117] "RemoveContainer" containerID="3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d" Apr 22 13:24:21.583951 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:24:21.583904 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d\": container with ID starting with 3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d not found: ID does not exist" containerID="3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d" Apr 22 13:24:21.583951 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.583928 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d"} err="failed to get container status \"3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d\": rpc error: code = NotFound desc = could not find container \"3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d\": container with ID starting with 3126ab4963d7f53686e8ecda24fc14014fb1b8578535af3be04c62b791767e8d not found: ID does not exist" Apr 22 13:24:21.598659 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.598633 2567 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-869c86d7c6-rcczx"] Apr 22 13:24:21.609706 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.609683 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-869c86d7c6-rcczx"] Apr 22 13:24:21.947537 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:21.947461 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e2bc8a-821b-4020-b034-bd0f6d378f4b" path="/var/lib/kubelet/pods/d6e2bc8a-821b-4020-b034-bd0f6d378f4b/volumes" Apr 22 13:24:56.955133 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:56.955102 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d979f9785-8m4vg"] Apr 22 13:24:56.955600 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:56.955390 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6e2bc8a-821b-4020-b034-bd0f6d378f4b" containerName="console" Apr 22 13:24:56.955600 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:56.955401 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e2bc8a-821b-4020-b034-bd0f6d378f4b" containerName="console" Apr 22 13:24:56.955600 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:56.955449 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6e2bc8a-821b-4020-b034-bd0f6d378f4b" containerName="console" Apr 22 13:24:56.958261 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:56.958242 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:56.975698 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:56.975672 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d979f9785-8m4vg"] Apr 22 13:24:57.054850 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.054826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-oauth-config\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.055006 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.054872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-oauth-serving-cert\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.055006 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.054893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-serving-cert\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.055006 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.054974 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-service-ca\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 
13:24:57.055127 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.055021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-config\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.055127 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.055042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2wr\" (UniqueName: \"kubernetes.io/projected/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-kube-api-access-jm2wr\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.055127 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.055085 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-trusted-ca-bundle\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155620 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-service-ca\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155620 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-config\") pod 
\"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155898 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155646 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2wr\" (UniqueName: \"kubernetes.io/projected/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-kube-api-access-jm2wr\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155898 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-trusted-ca-bundle\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155898 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-oauth-config\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155898 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-oauth-serving-cert\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.155898 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.155752 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-serving-cert\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.156477 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.156446 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-config\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.156592 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.156451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-service-ca\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.156592 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.156571 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-oauth-serving-cert\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.156689 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.156606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-trusted-ca-bundle\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.158716 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.158692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-serving-cert\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.158716 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.158714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-oauth-config\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.164983 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.164964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2wr\" (UniqueName: \"kubernetes.io/projected/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-kube-api-access-jm2wr\") pod \"console-6d979f9785-8m4vg\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.267618 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.267586 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:24:57.383244 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.383222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d979f9785-8m4vg"] Apr 22 13:24:57.385481 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:24:57.385459 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4832ef4_7313_458a_9ed5_3ff6e466d3eb.slice/crio-87632dda8be4dc4fe0ec955476182f9cea8b49cb5c338722884ca0c4b6861f59 WatchSource:0}: Error finding container 87632dda8be4dc4fe0ec955476182f9cea8b49cb5c338722884ca0c4b6861f59: Status 404 returned error can't find the container with id 87632dda8be4dc4fe0ec955476182f9cea8b49cb5c338722884ca0c4b6861f59 Apr 22 13:24:57.672375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.672289 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d979f9785-8m4vg" event={"ID":"b4832ef4-7313-458a-9ed5-3ff6e466d3eb","Type":"ContainerStarted","Data":"c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e"} Apr 22 13:24:57.672375 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.672325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d979f9785-8m4vg" event={"ID":"b4832ef4-7313-458a-9ed5-3ff6e466d3eb","Type":"ContainerStarted","Data":"87632dda8be4dc4fe0ec955476182f9cea8b49cb5c338722884ca0c4b6861f59"} Apr 22 13:24:57.691261 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:24:57.691213 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d979f9785-8m4vg" podStartSLOduration=1.691198091 podStartE2EDuration="1.691198091s" podCreationTimestamp="2026-04-22 13:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:24:57.690464663 +0000 UTC 
m=+214.343568444" watchObservedRunningTime="2026-04-22 13:24:57.691198091 +0000 UTC m=+214.344301872" Apr 22 13:25:07.268437 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:07.268393 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:25:07.268872 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:07.268454 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:25:07.273096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:07.273072 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:25:07.701157 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:07.701068 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:25:07.746312 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:07.746273 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-977545878-lwm7h"] Apr 22 13:25:32.771253 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:32.771192 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-977545878-lwm7h" podUID="96e13b96-a4cd-41f2-bbad-824aba6f55ef" containerName="console" containerID="cri-o://4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c" gracePeriod=15 Apr 22 13:25:33.010028 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.010006 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-977545878-lwm7h_96e13b96-a4cd-41f2-bbad-824aba6f55ef/console/0.log" Apr 22 13:25:33.010188 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.010066 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-977545878-lwm7h" Apr 22 13:25:33.121668 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.121583 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-service-ca\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.121668 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.121628 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-oauth-serving-cert\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.121668 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.121668 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-oauth-config\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.121930 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.121697 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-config\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.121930 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.121716 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-trusted-ca-bundle\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.121930 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:25:33.121732 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/96e13b96-a4cd-41f2-bbad-824aba6f55ef-kube-api-access-lqzfc\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.121930 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.121767 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-serving-cert\") pod \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\" (UID: \"96e13b96-a4cd-41f2-bbad-824aba6f55ef\") " Apr 22 13:25:33.122214 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.122115 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:25:33.122278 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.122213 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:25:33.122278 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.122241 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-config" (OuterVolumeSpecName: "console-config") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:25:33.122350 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.122291 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:25:33.123880 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.123847 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 13:25:33.123985 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.123873 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 13:25:33.123985 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.123874 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e13b96-a4cd-41f2-bbad-824aba6f55ef-kube-api-access-lqzfc" (OuterVolumeSpecName: "kube-api-access-lqzfc") pod "96e13b96-a4cd-41f2-bbad-824aba6f55ef" (UID: "96e13b96-a4cd-41f2-bbad-824aba6f55ef"). InnerVolumeSpecName "kube-api-access-lqzfc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:25:33.223018 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.222985 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-oauth-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:25:33.223018 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.223013 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:25:33.223018 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.223023 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-trusted-ca-bundle\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:25:33.223264 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.223033 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lqzfc\" (UniqueName: \"kubernetes.io/projected/96e13b96-a4cd-41f2-bbad-824aba6f55ef-kube-api-access-lqzfc\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:25:33.223264 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.223042 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/96e13b96-a4cd-41f2-bbad-824aba6f55ef-console-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:25:33.223264 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.223050 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-service-ca\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:25:33.223264 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.223059 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e13b96-a4cd-41f2-bbad-824aba6f55ef-oauth-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:25:33.767649 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.767622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-977545878-lwm7h_96e13b96-a4cd-41f2-bbad-824aba6f55ef/console/0.log"
Apr 22 13:25:33.767807 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.767661 2567 generic.go:358] "Generic (PLEG): container finished" podID="96e13b96-a4cd-41f2-bbad-824aba6f55ef" containerID="4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c" exitCode=2
Apr 22 13:25:33.767807 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.767720 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-977545878-lwm7h"
Apr 22 13:25:33.767807 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.767727 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-977545878-lwm7h" event={"ID":"96e13b96-a4cd-41f2-bbad-824aba6f55ef","Type":"ContainerDied","Data":"4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c"}
Apr 22 13:25:33.767807 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.767757 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-977545878-lwm7h" event={"ID":"96e13b96-a4cd-41f2-bbad-824aba6f55ef","Type":"ContainerDied","Data":"955f7d5ef474223aadbc479bdbe87d70d229c8b84a0978f8cd65b7368fca925f"}
Apr 22 13:25:33.767807 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.767775 2567 scope.go:117] "RemoveContainer" containerID="4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c"
Apr 22 13:25:33.776046 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.775846 2567 scope.go:117] "RemoveContainer" containerID="4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c"
Apr 22 13:25:33.776267 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:25:33.776086 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c\": container with ID starting with 4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c not found: ID does not exist" containerID="4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c"
Apr 22 13:25:33.776267 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.776107 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c"} err="failed to get container status \"4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c\": rpc error: code = NotFound desc = could not find container \"4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c\": container with ID starting with 4d829f79a21c3411e5e977e9e52f8f901530b74f7b1369ed5b36d90f9c1f2b6c not found: ID does not exist"
Apr 22 13:25:33.787903 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.787876 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-977545878-lwm7h"]
Apr 22 13:25:33.791945 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.791923 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-977545878-lwm7h"]
Apr 22 13:25:33.947456 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:33.947426 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e13b96-a4cd-41f2-bbad-824aba6f55ef" path="/var/lib/kubelet/pods/96e13b96-a4cd-41f2-bbad-824aba6f55ef/volumes"
Apr 22 13:25:44.460883 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.460791 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2bm2k"]
Apr 22 13:25:44.461274 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.461042 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96e13b96-a4cd-41f2-bbad-824aba6f55ef" containerName="console"
Apr 22 13:25:44.461274 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.461053 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e13b96-a4cd-41f2-bbad-824aba6f55ef" containerName="console"
Apr 22 13:25:44.461274 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.461108 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="96e13b96-a4cd-41f2-bbad-824aba6f55ef" containerName="console"
Apr 22 13:25:44.464126 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.464106 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.466429 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.466406 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 13:25:44.473154 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.473125 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2bm2k"]
Apr 22 13:25:44.599444 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.599409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-kubelet-config\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.599444 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.599446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-dbus\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.599666 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.599472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-original-pull-secret\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.700748 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.700712 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-kubelet-config\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.700904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.700756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-dbus\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.700904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.700793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-original-pull-secret\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.700904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.700838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-kubelet-config\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.701016 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.700960 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-dbus\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.703001 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.702975 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2e1f3eb0-8599-43eb-a51e-a087b49b8c3a-original-pull-secret\") pod \"global-pull-secret-syncer-2bm2k\" (UID: \"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a\") " pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.773297 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.773272 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2bm2k"
Apr 22 13:25:44.885966 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:44.885934 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2bm2k"]
Apr 22 13:25:44.889137 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:25:44.889109 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1f3eb0_8599_43eb_a51e_a087b49b8c3a.slice/crio-256c0a20bafb1daea021aadb82aeac38b97b3ac9fd9b25762279d5529047d1f7 WatchSource:0}: Error finding container 256c0a20bafb1daea021aadb82aeac38b97b3ac9fd9b25762279d5529047d1f7: Status 404 returned error can't find the container with id 256c0a20bafb1daea021aadb82aeac38b97b3ac9fd9b25762279d5529047d1f7
Apr 22 13:25:45.802609 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:45.802571 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2bm2k" event={"ID":"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a","Type":"ContainerStarted","Data":"256c0a20bafb1daea021aadb82aeac38b97b3ac9fd9b25762279d5529047d1f7"}
Apr 22 13:25:48.812318 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:48.812285 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2bm2k" event={"ID":"2e1f3eb0-8599-43eb-a51e-a087b49b8c3a","Type":"ContainerStarted","Data":"be01e316e12da16c1dc193daf46ce664269081156ffa8615924fd0a6738155d0"}
Apr 22 13:25:48.828776 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:25:48.828718 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2bm2k" podStartSLOduration=1.402452226 podStartE2EDuration="4.828700496s" podCreationTimestamp="2026-04-22 13:25:44 +0000 UTC" firstStartedPulling="2026-04-22 13:25:44.890661788 +0000 UTC m=+261.543765547" lastFinishedPulling="2026-04-22 13:25:48.316910055 +0000 UTC m=+264.970013817" observedRunningTime="2026-04-22 13:25:48.828099952 +0000 UTC m=+265.481203732" watchObservedRunningTime="2026-04-22 13:25:48.828700496 +0000 UTC m=+265.481804280"
Apr 22 13:26:16.942286 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:16.942255 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"]
Apr 22 13:26:16.945478 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:16.945459 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:16.947764 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:16.947744 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 13:26:16.948515 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:16.948486 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 13:26:16.948660 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:16.948646 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tjz2s\""
Apr 22 13:26:17.006105 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.006074 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"]
Apr 22 13:26:17.032729 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.032695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8x9k\" (UniqueName: \"kubernetes.io/projected/811c1c07-68d8-491c-a078-b797d4e66ac2-kube-api-access-w8x9k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.032886 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.032746 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.032886 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.032774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.133331 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.133300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.133331 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.133338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.133528 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.133386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8x9k\" (UniqueName: \"kubernetes.io/projected/811c1c07-68d8-491c-a078-b797d4e66ac2-kube-api-access-w8x9k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.133709 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.133688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.133777 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.133761 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.141901 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.141870 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8x9k\" (UniqueName: \"kubernetes.io/projected/811c1c07-68d8-491c-a078-b797d4e66ac2-kube-api-access-w8x9k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.253866 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.253786 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:17.368121 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.368095 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"]
Apr 22 13:26:17.370709 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:26:17.370676 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod811c1c07_68d8_491c_a078_b797d4e66ac2.slice/crio-7773dcb5c1197ede34feebbe82b4276d1d0bb33ffa1331e9eef83d4e3375b037 WatchSource:0}: Error finding container 7773dcb5c1197ede34feebbe82b4276d1d0bb33ffa1331e9eef83d4e3375b037: Status 404 returned error can't find the container with id 7773dcb5c1197ede34feebbe82b4276d1d0bb33ffa1331e9eef83d4e3375b037
Apr 22 13:26:17.891887 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:17.891856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66" event={"ID":"811c1c07-68d8-491c-a078-b797d4e66ac2","Type":"ContainerStarted","Data":"7773dcb5c1197ede34feebbe82b4276d1d0bb33ffa1331e9eef83d4e3375b037"}
Apr 22 13:26:22.908522 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:22.908425 2567 generic.go:358] "Generic (PLEG): container finished" podID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerID="baedf63bce31ffd128ba6acec26ff8cd7babde4a503b2f21fbad923cf045f304" exitCode=0
Apr 22 13:26:22.908522 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:22.908502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66" event={"ID":"811c1c07-68d8-491c-a078-b797d4e66ac2","Type":"ContainerDied","Data":"baedf63bce31ffd128ba6acec26ff8cd7babde4a503b2f21fbad923cf045f304"}
Apr 22 13:26:23.827042 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:23.827013 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:26:23.827310 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:23.827096 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:26:23.830301 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:23.830281 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 13:26:25.918293 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:25.918257 2567 generic.go:358] "Generic (PLEG): container finished" podID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerID="cc7ef61098248b1c2d2760ae936e1799cb9accb0a7f903e7b77329a5911134c8" exitCode=0
Apr 22 13:26:25.918729 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:25.918314 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66" event={"ID":"811c1c07-68d8-491c-a078-b797d4e66ac2","Type":"ContainerDied","Data":"cc7ef61098248b1c2d2760ae936e1799cb9accb0a7f903e7b77329a5911134c8"}
Apr 22 13:26:25.919280 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:25.919264 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 13:26:32.939637 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:32.939605 2567 generic.go:358] "Generic (PLEG): container finished" podID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerID="9fc005cd018b409050a9d754cdc2580ee1ec686fd61e8e00a3f388a0323c0585" exitCode=0
Apr 22 13:26:32.940005 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:32.939683 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66" event={"ID":"811c1c07-68d8-491c-a078-b797d4e66ac2","Type":"ContainerDied","Data":"9fc005cd018b409050a9d754cdc2580ee1ec686fd61e8e00a3f388a0323c0585"}
Apr 22 13:26:34.061602 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.061579 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:34.168184 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.168136 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8x9k\" (UniqueName: \"kubernetes.io/projected/811c1c07-68d8-491c-a078-b797d4e66ac2-kube-api-access-w8x9k\") pod \"811c1c07-68d8-491c-a078-b797d4e66ac2\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") "
Apr 22 13:26:34.168184 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.168191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-util\") pod \"811c1c07-68d8-491c-a078-b797d4e66ac2\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") "
Apr 22 13:26:34.168419 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.168213 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-bundle\") pod \"811c1c07-68d8-491c-a078-b797d4e66ac2\" (UID: \"811c1c07-68d8-491c-a078-b797d4e66ac2\") "
Apr 22 13:26:34.168822 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.168793 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-bundle" (OuterVolumeSpecName: "bundle") pod "811c1c07-68d8-491c-a078-b797d4e66ac2" (UID: "811c1c07-68d8-491c-a078-b797d4e66ac2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 13:26:34.170341 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.170314 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811c1c07-68d8-491c-a078-b797d4e66ac2-kube-api-access-w8x9k" (OuterVolumeSpecName: "kube-api-access-w8x9k") pod "811c1c07-68d8-491c-a078-b797d4e66ac2" (UID: "811c1c07-68d8-491c-a078-b797d4e66ac2"). InnerVolumeSpecName "kube-api-access-w8x9k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 13:26:34.172719 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.172686 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-util" (OuterVolumeSpecName: "util") pod "811c1c07-68d8-491c-a078-b797d4e66ac2" (UID: "811c1c07-68d8-491c-a078-b797d4e66ac2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 13:26:34.269534 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.269508 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8x9k\" (UniqueName: \"kubernetes.io/projected/811c1c07-68d8-491c-a078-b797d4e66ac2-kube-api-access-w8x9k\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:26:34.269534 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.269535 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-util\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:26:34.269708 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.269549 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/811c1c07-68d8-491c-a078-b797d4e66ac2-bundle\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 13:26:34.947269 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.947234 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66" event={"ID":"811c1c07-68d8-491c-a078-b797d4e66ac2","Type":"ContainerDied","Data":"7773dcb5c1197ede34feebbe82b4276d1d0bb33ffa1331e9eef83d4e3375b037"}
Apr 22 13:26:34.947269 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.947260 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzbs66"
Apr 22 13:26:34.947269 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:34.947271 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7773dcb5c1197ede34feebbe82b4276d1d0bb33ffa1331e9eef83d4e3375b037"
Apr 22 13:26:40.038655 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038617 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"]
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038937 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="util"
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038950 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="util"
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038965 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="pull"
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038970 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="pull"
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038981 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="extract"
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.038987 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="extract"
Apr 22 13:26:40.039096 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.039030 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="811c1c07-68d8-491c-a078-b797d4e66ac2" containerName="extract"
Apr 22 13:26:40.041827 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.041812 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.044294 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.044272 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 13:26:40.044565 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.044538 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-rvmnv\""
Apr 22 13:26:40.044676 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.044635 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:26:40.056446 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.056426 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"]
Apr 22 13:26:40.111809 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.111782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b62c0f7-d80f-4a49-849f-bee46c4fc233-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-d7jdw\" (UID: \"6b62c0f7-d80f-4a49-849f-bee46c4fc233\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.111978 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.111834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p8nw\" (UniqueName: \"kubernetes.io/projected/6b62c0f7-d80f-4a49-849f-bee46c4fc233-kube-api-access-9p8nw\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-d7jdw\" (UID: \"6b62c0f7-d80f-4a49-849f-bee46c4fc233\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.213202 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.213147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b62c0f7-d80f-4a49-849f-bee46c4fc233-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-d7jdw\" (UID: \"6b62c0f7-d80f-4a49-849f-bee46c4fc233\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.213380 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.213231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p8nw\" (UniqueName: \"kubernetes.io/projected/6b62c0f7-d80f-4a49-849f-bee46c4fc233-kube-api-access-9p8nw\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-d7jdw\" (UID: \"6b62c0f7-d80f-4a49-849f-bee46c4fc233\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.213554 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.213533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b62c0f7-d80f-4a49-849f-bee46c4fc233-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-d7jdw\" (UID: \"6b62c0f7-d80f-4a49-849f-bee46c4fc233\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.224752 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.224723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p8nw\" (UniqueName: \"kubernetes.io/projected/6b62c0f7-d80f-4a49-849f-bee46c4fc233-kube-api-access-9p8nw\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-d7jdw\" (UID: \"6b62c0f7-d80f-4a49-849f-bee46c4fc233\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.350502 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.350413 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"
Apr 22 13:26:40.493574 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.493543 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw"]
Apr 22 13:26:40.496564 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:26:40.496538 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b62c0f7_d80f_4a49_849f_bee46c4fc233.slice/crio-86cf6353647226c93a798fc6d8681208c74b652aeed565c2713cce8841ffd9e8 WatchSource:0}: Error finding container 86cf6353647226c93a798fc6d8681208c74b652aeed565c2713cce8841ffd9e8: Status 404 returned error can't find the container with id 86cf6353647226c93a798fc6d8681208c74b652aeed565c2713cce8841ffd9e8
Apr 22 13:26:40.965453 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:40.965417 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw" event={"ID":"6b62c0f7-d80f-4a49-849f-bee46c4fc233","Type":"ContainerStarted","Data":"86cf6353647226c93a798fc6d8681208c74b652aeed565c2713cce8841ffd9e8"}
Apr 22 13:26:42.973133 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:42.973097 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw" event={"ID":"6b62c0f7-d80f-4a49-849f-bee46c4fc233","Type":"ContainerStarted","Data":"b6271b24c31acc010ba6ecb540909381eb5c28027bd4babc4035ad9ffaba5196"}
Apr 22 13:26:43.015725 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:43.015660 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-d7jdw" podStartSLOduration=1.041759701 podStartE2EDuration="3.015640747s" podCreationTimestamp="2026-04-22 13:26:40 +0000 UTC" firstStartedPulling="2026-04-22 13:26:40.49903847 +0000 UTC m=+317.152142235" lastFinishedPulling="2026-04-22 13:26:42.472919518 +0000 UTC m=+319.126023281" observedRunningTime="2026-04-22 13:26:43.012772782 +0000 UTC m=+319.665876564" watchObservedRunningTime="2026-04-22 13:26:43.015640747 +0000 UTC m=+319.668744531"
Apr 22 13:26:44.858310 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.858273 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-wq8zg"]
Apr 22 13:26:44.861472 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.861454 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg"
Apr 22 13:26:44.866241 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.866215 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 13:26:44.867007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.866990 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-5vdnc\""
Apr 22 13:26:44.867092 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.867001 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 13:26:44.878784 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.878759 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-wq8zg"]
Apr 22 13:26:44.947948 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.947904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7653d694-0559-4475-8b95-37a9cf739d1a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-wq8zg\" (UID: \"7653d694-0559-4475-8b95-37a9cf739d1a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg"
Apr 22 13:26:44.948128 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:44.947971 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7d4\" (UniqueName: \"kubernetes.io/projected/7653d694-0559-4475-8b95-37a9cf739d1a-kube-api-access-vl7d4\") pod \"cert-manager-webhook-587ccfb98-wq8zg\" (UID: \"7653d694-0559-4475-8b95-37a9cf739d1a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg"
Apr 22 13:26:45.049067 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.049032 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7653d694-0559-4475-8b95-37a9cf739d1a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-wq8zg\" (UID: \"7653d694-0559-4475-8b95-37a9cf739d1a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg"
Apr 22 13:26:45.049261 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.049111 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7d4\" (UniqueName: \"kubernetes.io/projected/7653d694-0559-4475-8b95-37a9cf739d1a-kube-api-access-vl7d4\") pod \"cert-manager-webhook-587ccfb98-wq8zg\" (UID: \"7653d694-0559-4475-8b95-37a9cf739d1a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg"
Apr 22 13:26:45.057845 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.057819 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7653d694-0559-4475-8b95-37a9cf739d1a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-wq8zg\"
(UID: \"7653d694-0559-4475-8b95-37a9cf739d1a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" Apr 22 13:26:45.057961 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.057942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7d4\" (UniqueName: \"kubernetes.io/projected/7653d694-0559-4475-8b95-37a9cf739d1a-kube-api-access-vl7d4\") pod \"cert-manager-webhook-587ccfb98-wq8zg\" (UID: \"7653d694-0559-4475-8b95-37a9cf739d1a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" Apr 22 13:26:45.184108 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.184021 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" Apr 22 13:26:45.329852 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.329811 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-wq8zg"] Apr 22 13:26:45.333710 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:26:45.333678 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7653d694_0559_4475_8b95_37a9cf739d1a.slice/crio-aad5fa9b8749b66af59fb56cbe19ddf341f986537d9ead3128aa955120944a84 WatchSource:0}: Error finding container aad5fa9b8749b66af59fb56cbe19ddf341f986537d9ead3128aa955120944a84: Status 404 returned error can't find the container with id aad5fa9b8749b66af59fb56cbe19ddf341f986537d9ead3128aa955120944a84 Apr 22 13:26:45.981587 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:45.981552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" event={"ID":"7653d694-0559-4475-8b95-37a9cf739d1a","Type":"ContainerStarted","Data":"aad5fa9b8749b66af59fb56cbe19ddf341f986537d9ead3128aa955120944a84"} Apr 22 13:26:47.026669 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.026626 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-68b757865b-9tlt7"] Apr 22 13:26:47.029642 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.029626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.035332 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.035311 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-8xvjv\"" Apr 22 13:26:47.046329 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.046308 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9tlt7"] Apr 22 13:26:47.167713 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.167680 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f57c11b4-9ea2-467b-986b-6d41a4f4cb32-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9tlt7\" (UID: \"f57c11b4-9ea2-467b-986b-6d41a4f4cb32\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.167889 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.167741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482kh\" (UniqueName: \"kubernetes.io/projected/f57c11b4-9ea2-467b-986b-6d41a4f4cb32-kube-api-access-482kh\") pod \"cert-manager-cainjector-68b757865b-9tlt7\" (UID: \"f57c11b4-9ea2-467b-986b-6d41a4f4cb32\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.268262 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.268224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-482kh\" (UniqueName: \"kubernetes.io/projected/f57c11b4-9ea2-467b-986b-6d41a4f4cb32-kube-api-access-482kh\") pod \"cert-manager-cainjector-68b757865b-9tlt7\" (UID: 
\"f57c11b4-9ea2-467b-986b-6d41a4f4cb32\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.268462 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.268323 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f57c11b4-9ea2-467b-986b-6d41a4f4cb32-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9tlt7\" (UID: \"f57c11b4-9ea2-467b-986b-6d41a4f4cb32\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.278007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.277941 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-482kh\" (UniqueName: \"kubernetes.io/projected/f57c11b4-9ea2-467b-986b-6d41a4f4cb32-kube-api-access-482kh\") pod \"cert-manager-cainjector-68b757865b-9tlt7\" (UID: \"f57c11b4-9ea2-467b-986b-6d41a4f4cb32\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.278007 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.277965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f57c11b4-9ea2-467b-986b-6d41a4f4cb32-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9tlt7\" (UID: \"f57c11b4-9ea2-467b-986b-6d41a4f4cb32\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.338138 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.338095 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" Apr 22 13:26:47.479267 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.479229 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9tlt7"] Apr 22 13:26:47.481913 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:26:47.481881 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf57c11b4_9ea2_467b_986b_6d41a4f4cb32.slice/crio-cbe6e79d824d0106ce3afea3ed9e6b5009dae41d861a802187d6bf92c299bc0d WatchSource:0}: Error finding container cbe6e79d824d0106ce3afea3ed9e6b5009dae41d861a802187d6bf92c299bc0d: Status 404 returned error can't find the container with id cbe6e79d824d0106ce3afea3ed9e6b5009dae41d861a802187d6bf92c299bc0d Apr 22 13:26:47.988848 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:47.988811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" event={"ID":"f57c11b4-9ea2-467b-986b-6d41a4f4cb32","Type":"ContainerStarted","Data":"cbe6e79d824d0106ce3afea3ed9e6b5009dae41d861a802187d6bf92c299bc0d"} Apr 22 13:26:48.993097 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:48.993058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" event={"ID":"7653d694-0559-4475-8b95-37a9cf739d1a","Type":"ContainerStarted","Data":"502ace6d0f82f595f056bca75555ad84a65b6168ea6f56ba6b828c889e680e34"} Apr 22 13:26:48.993539 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:48.993150 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" Apr 22 13:26:48.994323 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:48.994304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" 
event={"ID":"f57c11b4-9ea2-467b-986b-6d41a4f4cb32","Type":"ContainerStarted","Data":"acef86d30523f42c4edfd3becc2a0a0bbc428a0e044a39a72ea6cfc7ee1de51d"} Apr 22 13:26:49.013696 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:49.013651 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" podStartSLOduration=1.9321257539999999 podStartE2EDuration="5.013639014s" podCreationTimestamp="2026-04-22 13:26:44 +0000 UTC" firstStartedPulling="2026-04-22 13:26:45.335559501 +0000 UTC m=+321.988663260" lastFinishedPulling="2026-04-22 13:26:48.417072751 +0000 UTC m=+325.070176520" observedRunningTime="2026-04-22 13:26:49.011848139 +0000 UTC m=+325.664951931" watchObservedRunningTime="2026-04-22 13:26:49.013639014 +0000 UTC m=+325.666742795" Apr 22 13:26:49.036308 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:49.036265 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-9tlt7" podStartSLOduration=1.10258104 podStartE2EDuration="2.036254083s" podCreationTimestamp="2026-04-22 13:26:47 +0000 UTC" firstStartedPulling="2026-04-22 13:26:47.484140305 +0000 UTC m=+324.137244068" lastFinishedPulling="2026-04-22 13:26:48.417813347 +0000 UTC m=+325.070917111" observedRunningTime="2026-04-22 13:26:49.036076638 +0000 UTC m=+325.689180419" watchObservedRunningTime="2026-04-22 13:26:49.036254083 +0000 UTC m=+325.689357864" Apr 22 13:26:54.999722 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:54.999689 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-wq8zg" Apr 22 13:26:56.023918 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.023877 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-r625m"] Apr 22 13:26:56.030849 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.030824 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.033075 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.033052 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vd7pv\"" Apr 22 13:26:56.038923 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.038900 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-r625m"] Apr 22 13:26:56.139207 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.139148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4cf\" (UniqueName: \"kubernetes.io/projected/4862e7c8-681c-45a0-8e32-3751d638f85a-kube-api-access-5w4cf\") pod \"cert-manager-79c8d999ff-r625m\" (UID: \"4862e7c8-681c-45a0-8e32-3751d638f85a\") " pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.139378 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.139268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4862e7c8-681c-45a0-8e32-3751d638f85a-bound-sa-token\") pod \"cert-manager-79c8d999ff-r625m\" (UID: \"4862e7c8-681c-45a0-8e32-3751d638f85a\") " pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.240664 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.240633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4862e7c8-681c-45a0-8e32-3751d638f85a-bound-sa-token\") pod \"cert-manager-79c8d999ff-r625m\" (UID: \"4862e7c8-681c-45a0-8e32-3751d638f85a\") " pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.240789 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.240705 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w4cf\" (UniqueName: 
\"kubernetes.io/projected/4862e7c8-681c-45a0-8e32-3751d638f85a-kube-api-access-5w4cf\") pod \"cert-manager-79c8d999ff-r625m\" (UID: \"4862e7c8-681c-45a0-8e32-3751d638f85a\") " pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.248779 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.248746 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4cf\" (UniqueName: \"kubernetes.io/projected/4862e7c8-681c-45a0-8e32-3751d638f85a-kube-api-access-5w4cf\") pod \"cert-manager-79c8d999ff-r625m\" (UID: \"4862e7c8-681c-45a0-8e32-3751d638f85a\") " pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.249653 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.249633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4862e7c8-681c-45a0-8e32-3751d638f85a-bound-sa-token\") pod \"cert-manager-79c8d999ff-r625m\" (UID: \"4862e7c8-681c-45a0-8e32-3751d638f85a\") " pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.340869 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.340783 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-r625m" Apr 22 13:26:56.456306 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:56.456283 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-r625m"] Apr 22 13:26:56.458754 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:26:56.458731 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4862e7c8_681c_45a0_8e32_3751d638f85a.slice/crio-427d42bcb795e548d1cb7ad1dc5ca67d8ab39c9d77eff3f1a5227c9e794c431a WatchSource:0}: Error finding container 427d42bcb795e548d1cb7ad1dc5ca67d8ab39c9d77eff3f1a5227c9e794c431a: Status 404 returned error can't find the container with id 427d42bcb795e548d1cb7ad1dc5ca67d8ab39c9d77eff3f1a5227c9e794c431a Apr 22 13:26:57.019177 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.019127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-r625m" event={"ID":"4862e7c8-681c-45a0-8e32-3751d638f85a","Type":"ContainerStarted","Data":"c6d2169263ade784edff690000083bfdf3f4ad37a5bab9ac1073370d55bd34e3"} Apr 22 13:26:57.019177 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.019161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-r625m" event={"ID":"4862e7c8-681c-45a0-8e32-3751d638f85a","Type":"ContainerStarted","Data":"427d42bcb795e548d1cb7ad1dc5ca67d8ab39c9d77eff3f1a5227c9e794c431a"} Apr 22 13:26:57.047917 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.047866 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-r625m" podStartSLOduration=1.047852506 podStartE2EDuration="1.047852506s" podCreationTimestamp="2026-04-22 13:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:26:57.045974118 +0000 UTC 
m=+333.699077898" watchObservedRunningTime="2026-04-22 13:26:57.047852506 +0000 UTC m=+333.700956287" Apr 22 13:26:57.583116 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.583081 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp"] Apr 22 13:26:57.586538 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.586521 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.588767 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.588747 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 13:26:57.589541 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.589525 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 13:26:57.589616 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.589528 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tjz2s\"" Apr 22 13:26:57.602417 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.602392 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp"] Apr 22 13:26:57.755120 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.755085 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.755120 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:26:57.755123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.755345 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.755205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnnz\" (UniqueName: \"kubernetes.io/projected/f49487ce-0a6b-4858-9000-ca8bf28358cc-kube-api-access-7xnnz\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.856288 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.856219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.856288 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.856255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnnz\" (UniqueName: \"kubernetes.io/projected/f49487ce-0a6b-4858-9000-ca8bf28358cc-kube-api-access-7xnnz\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.856448 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:26:57.856321 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.856649 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.856628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.856686 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.856670 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.864855 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.864825 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnnz\" (UniqueName: \"kubernetes.io/projected/f49487ce-0a6b-4858-9000-ca8bf28358cc-kube-api-access-7xnnz\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:57.895790 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:57.895770 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:26:58.025796 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:58.025758 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp"] Apr 22 13:26:58.028490 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:26:58.028463 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49487ce_0a6b_4858_9000_ca8bf28358cc.slice/crio-6e7449b116abb2ede1303ef02790f21b57d828547e53f8dad998162523eb3332 WatchSource:0}: Error finding container 6e7449b116abb2ede1303ef02790f21b57d828547e53f8dad998162523eb3332: Status 404 returned error can't find the container with id 6e7449b116abb2ede1303ef02790f21b57d828547e53f8dad998162523eb3332 Apr 22 13:26:59.028244 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:59.028207 2567 generic.go:358] "Generic (PLEG): container finished" podID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerID="dd8325561b07185e74e9e5998b8a499a979d37d94daba65980c6595630b22e4e" exitCode=0 Apr 22 13:26:59.028596 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:59.028267 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" event={"ID":"f49487ce-0a6b-4858-9000-ca8bf28358cc","Type":"ContainerDied","Data":"dd8325561b07185e74e9e5998b8a499a979d37d94daba65980c6595630b22e4e"} Apr 22 13:26:59.028596 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:26:59.028288 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" event={"ID":"f49487ce-0a6b-4858-9000-ca8bf28358cc","Type":"ContainerStarted","Data":"6e7449b116abb2ede1303ef02790f21b57d828547e53f8dad998162523eb3332"} Apr 22 13:27:22.101991 ip-10-0-142-133 kubenswrapper[2567]: 
I0422 13:27:22.101958 2567 generic.go:358] "Generic (PLEG): container finished" podID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerID="b57ff6e2c9ca4b7f86006f7170c394e0df7c04f8daac2d98b781138264c3797c" exitCode=0 Apr 22 13:27:22.102373 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:22.102029 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" event={"ID":"f49487ce-0a6b-4858-9000-ca8bf28358cc","Type":"ContainerDied","Data":"b57ff6e2c9ca4b7f86006f7170c394e0df7c04f8daac2d98b781138264c3797c"} Apr 22 13:27:23.107081 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:23.107045 2567 generic.go:358] "Generic (PLEG): container finished" podID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerID="fb22c9bcc9e3448ed97d7fb15772ad52791b5e74e4084d54173051e83539639d" exitCode=0 Apr 22 13:27:23.107534 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:23.107135 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" event={"ID":"f49487ce-0a6b-4858-9000-ca8bf28358cc","Type":"ContainerDied","Data":"fb22c9bcc9e3448ed97d7fb15772ad52791b5e74e4084d54173051e83539639d"} Apr 22 13:27:24.226488 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.226466 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:27:24.261465 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.261436 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-bundle\") pod \"f49487ce-0a6b-4858-9000-ca8bf28358cc\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " Apr 22 13:27:24.261590 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.261494 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-util\") pod \"f49487ce-0a6b-4858-9000-ca8bf28358cc\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " Apr 22 13:27:24.261590 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.261532 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xnnz\" (UniqueName: \"kubernetes.io/projected/f49487ce-0a6b-4858-9000-ca8bf28358cc-kube-api-access-7xnnz\") pod \"f49487ce-0a6b-4858-9000-ca8bf28358cc\" (UID: \"f49487ce-0a6b-4858-9000-ca8bf28358cc\") " Apr 22 13:27:24.261821 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.261793 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-bundle" (OuterVolumeSpecName: "bundle") pod "f49487ce-0a6b-4858-9000-ca8bf28358cc" (UID: "f49487ce-0a6b-4858-9000-ca8bf28358cc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 13:27:24.263573 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.263550 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49487ce-0a6b-4858-9000-ca8bf28358cc-kube-api-access-7xnnz" (OuterVolumeSpecName: "kube-api-access-7xnnz") pod "f49487ce-0a6b-4858-9000-ca8bf28358cc" (UID: "f49487ce-0a6b-4858-9000-ca8bf28358cc"). InnerVolumeSpecName "kube-api-access-7xnnz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:27:24.266117 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.266093 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-util" (OuterVolumeSpecName: "util") pod "f49487ce-0a6b-4858-9000-ca8bf28358cc" (UID: "f49487ce-0a6b-4858-9000-ca8bf28358cc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 13:27:24.362850 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.362758 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-util\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:27:24.362850 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.362800 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xnnz\" (UniqueName: \"kubernetes.io/projected/f49487ce-0a6b-4858-9000-ca8bf28358cc-kube-api-access-7xnnz\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:27:24.362850 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:24.362813 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f49487ce-0a6b-4858-9000-ca8bf28358cc-bundle\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:27:25.116209 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:25.116153 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" event={"ID":"f49487ce-0a6b-4858-9000-ca8bf28358cc","Type":"ContainerDied","Data":"6e7449b116abb2ede1303ef02790f21b57d828547e53f8dad998162523eb3332"} Apr 22 13:27:25.116209 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:25.116215 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e7449b116abb2ede1303ef02790f21b57d828547e53f8dad998162523eb3332" Apr 22 13:27:25.116425 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:25.116190 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exkjrp" Apr 22 13:27:29.746425 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746391 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk"] Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746662 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="extract" Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746674 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="extract" Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746682 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="pull" Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746688 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="pull" Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746703 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="util" Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746708 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="util" Apr 22 13:27:29.746874 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.746772 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49487ce-0a6b-4858-9000-ca8bf28358cc" containerName="extract" Apr 22 13:27:29.750684 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.750668 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:29.753359 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.753335 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 22 13:27:29.753492 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.753446 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-vjxt9\"" Apr 22 13:27:29.754040 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.754026 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 22 13:27:29.759191 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.759153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk"] Apr 22 13:27:29.799077 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.799044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2wg\" (UniqueName: \"kubernetes.io/projected/2e2ee773-5689-44fb-8fb0-cfdb5d688559-kube-api-access-nr2wg\") pod \"jobset-operator-747c5859c7-4bdjk\" (UID: \"2e2ee773-5689-44fb-8fb0-cfdb5d688559\") " 
pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:29.799229 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.799079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e2ee773-5689-44fb-8fb0-cfdb5d688559-tmp\") pod \"jobset-operator-747c5859c7-4bdjk\" (UID: \"2e2ee773-5689-44fb-8fb0-cfdb5d688559\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:29.900115 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.900080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2wg\" (UniqueName: \"kubernetes.io/projected/2e2ee773-5689-44fb-8fb0-cfdb5d688559-kube-api-access-nr2wg\") pod \"jobset-operator-747c5859c7-4bdjk\" (UID: \"2e2ee773-5689-44fb-8fb0-cfdb5d688559\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:29.900290 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.900123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e2ee773-5689-44fb-8fb0-cfdb5d688559-tmp\") pod \"jobset-operator-747c5859c7-4bdjk\" (UID: \"2e2ee773-5689-44fb-8fb0-cfdb5d688559\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:29.900519 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.900500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e2ee773-5689-44fb-8fb0-cfdb5d688559-tmp\") pod \"jobset-operator-747c5859c7-4bdjk\" (UID: \"2e2ee773-5689-44fb-8fb0-cfdb5d688559\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:29.912250 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:29.912223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2wg\" (UniqueName: 
\"kubernetes.io/projected/2e2ee773-5689-44fb-8fb0-cfdb5d688559-kube-api-access-nr2wg\") pod \"jobset-operator-747c5859c7-4bdjk\" (UID: \"2e2ee773-5689-44fb-8fb0-cfdb5d688559\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:30.059802 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:30.059768 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" Apr 22 13:27:30.184515 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:30.184491 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk"] Apr 22 13:27:30.186888 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:27:30.186855 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e2ee773_5689_44fb_8fb0_cfdb5d688559.slice/crio-182aeb4c1f530d43ca28f4080d161662f966e852350711919e5a1f3c262723b7 WatchSource:0}: Error finding container 182aeb4c1f530d43ca28f4080d161662f966e852350711919e5a1f3c262723b7: Status 404 returned error can't find the container with id 182aeb4c1f530d43ca28f4080d161662f966e852350711919e5a1f3c262723b7 Apr 22 13:27:31.137743 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:31.137697 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" event={"ID":"2e2ee773-5689-44fb-8fb0-cfdb5d688559","Type":"ContainerStarted","Data":"182aeb4c1f530d43ca28f4080d161662f966e852350711919e5a1f3c262723b7"} Apr 22 13:27:32.142985 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:32.142957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" event={"ID":"2e2ee773-5689-44fb-8fb0-cfdb5d688559","Type":"ContainerStarted","Data":"bfeb476e761e4e79d38eada96c9bc118f264716a81e2f25a5f31c6db668d7f18"} Apr 22 13:27:32.162937 ip-10-0-142-133 kubenswrapper[2567]: 
I0422 13:27:32.162875 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4bdjk" podStartSLOduration=1.2807962019999999 podStartE2EDuration="3.162857567s" podCreationTimestamp="2026-04-22 13:27:29 +0000 UTC" firstStartedPulling="2026-04-22 13:27:30.188336641 +0000 UTC m=+366.841440400" lastFinishedPulling="2026-04-22 13:27:32.070397998 +0000 UTC m=+368.723501765" observedRunningTime="2026-04-22 13:27:32.160639803 +0000 UTC m=+368.813743612" watchObservedRunningTime="2026-04-22 13:27:32.162857567 +0000 UTC m=+368.815961352" Apr 22 13:27:58.853591 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.853554 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f"] Apr 22 13:27:58.860693 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.860669 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:58.862851 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.862830 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 22 13:27:58.863080 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.863057 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 13:27:58.863080 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.863075 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 22 13:27:58.863681 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.863653 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-vbbbq\"" Apr 22 13:27:58.863765 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.863710 2567 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 13:27:58.865943 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.865923 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f"] Apr 22 13:27:58.935431 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.935398 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2d70830-0860-467c-918f-48559b25bef3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:58.935600 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.935453 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpgbk\" (UniqueName: \"kubernetes.io/projected/a2d70830-0860-467c-918f-48559b25bef3-kube-api-access-hpgbk\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:58.935600 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:58.935513 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a2d70830-0860-467c-918f-48559b25bef3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.036912 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.036873 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpgbk\" (UniqueName: 
\"kubernetes.io/projected/a2d70830-0860-467c-918f-48559b25bef3-kube-api-access-hpgbk\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.037080 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.036930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a2d70830-0860-467c-918f-48559b25bef3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.037080 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.036980 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2d70830-0860-467c-918f-48559b25bef3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.037686 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.037662 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a2d70830-0860-467c-918f-48559b25bef3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.039404 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.039382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2d70830-0860-467c-918f-48559b25bef3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: 
\"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.044436 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.044412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpgbk\" (UniqueName: \"kubernetes.io/projected/a2d70830-0860-467c-918f-48559b25bef3-kube-api-access-hpgbk\") pod \"kubeflow-trainer-controller-manager-55f5694779-5fr9f\" (UID: \"a2d70830-0860-467c-918f-48559b25bef3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.170586 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.170501 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:27:59.290897 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:27:59.290726 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f"] Apr 22 13:27:59.293590 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:27:59.293554 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d70830_0860_467c_918f_48559b25bef3.slice/crio-d8a55bcfa21726fe5499537d89c896972200c4b90fd1530b5b1b702024dfb78d WatchSource:0}: Error finding container d8a55bcfa21726fe5499537d89c896972200c4b90fd1530b5b1b702024dfb78d: Status 404 returned error can't find the container with id d8a55bcfa21726fe5499537d89c896972200c4b90fd1530b5b1b702024dfb78d Apr 22 13:28:00.229796 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:28:00.229749 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" event={"ID":"a2d70830-0860-467c-918f-48559b25bef3","Type":"ContainerStarted","Data":"d8a55bcfa21726fe5499537d89c896972200c4b90fd1530b5b1b702024dfb78d"} Apr 22 13:28:02.238343 ip-10-0-142-133 kubenswrapper[2567]: 
I0422 13:28:02.238306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" event={"ID":"a2d70830-0860-467c-918f-48559b25bef3","Type":"ContainerStarted","Data":"d0fac3053d2b4be9fd8b19f32125c99da04a094facc11af15a7524a90691f3f9"} Apr 22 13:28:02.238829 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:28:02.238366 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:28:02.254330 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:28:02.254279 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" podStartSLOduration=2.08191334 podStartE2EDuration="4.254263216s" podCreationTimestamp="2026-04-22 13:27:58 +0000 UTC" firstStartedPulling="2026-04-22 13:27:59.295209105 +0000 UTC m=+395.948312864" lastFinishedPulling="2026-04-22 13:28:01.467558981 +0000 UTC m=+398.120662740" observedRunningTime="2026-04-22 13:28:02.252743192 +0000 UTC m=+398.905846973" watchObservedRunningTime="2026-04-22 13:28:02.254263216 +0000 UTC m=+398.907366996" Apr 22 13:28:18.247034 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:28:18.247007 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5fr9f" Apr 22 13:29:48.661833 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.661799 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss"] Apr 22 13:29:48.664981 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.664961 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:29:48.667125 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.667104 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-c5gp4\"/\"kube-root-ca.crt\"" Apr 22 13:29:48.667828 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.667810 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-c5gp4\"/\"default-dockercfg-xx8ks\"" Apr 22 13:29:48.667930 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.667826 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-c5gp4\"/\"openshift-service-ca.crt\"" Apr 22 13:29:48.680377 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.680357 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss"] Apr 22 13:29:48.759434 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.759404 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp485\" (UniqueName: \"kubernetes.io/projected/de71d716-5be8-40e3-8f93-b768ee6635ea-kube-api-access-bp485\") pod \"test-trainjob-hcdvj-node-0-0-hnvss\" (UID: \"de71d716-5be8-40e3-8f93-b768ee6635ea\") " pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:29:48.860512 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.860479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp485\" (UniqueName: \"kubernetes.io/projected/de71d716-5be8-40e3-8f93-b768ee6635ea-kube-api-access-bp485\") pod \"test-trainjob-hcdvj-node-0-0-hnvss\" (UID: \"de71d716-5be8-40e3-8f93-b768ee6635ea\") " pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:29:48.868884 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.868857 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp485\" (UniqueName: 
\"kubernetes.io/projected/de71d716-5be8-40e3-8f93-b768ee6635ea-kube-api-access-bp485\") pod \"test-trainjob-hcdvj-node-0-0-hnvss\" (UID: \"de71d716-5be8-40e3-8f93-b768ee6635ea\") " pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:29:48.973844 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:48.973772 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:29:49.095907 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:49.091783 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss"] Apr 22 13:29:49.095907 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:29:49.095811 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde71d716_5be8_40e3_8f93_b768ee6635ea.slice/crio-e83c1b70f5f481e412916cab09b5095bfd3b9b65d3e35de511d3c546cff429b7 WatchSource:0}: Error finding container e83c1b70f5f481e412916cab09b5095bfd3b9b65d3e35de511d3c546cff429b7: Status 404 returned error can't find the container with id e83c1b70f5f481e412916cab09b5095bfd3b9b65d3e35de511d3c546cff429b7 Apr 22 13:29:49.570251 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:29:49.570208 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" event={"ID":"de71d716-5be8-40e3-8f93-b768ee6635ea","Type":"ContainerStarted","Data":"e83c1b70f5f481e412916cab09b5095bfd3b9b65d3e35de511d3c546cff429b7"} Apr 22 13:31:23.851396 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:31:23.851365 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:31:23.852834 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:31:23.852807 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:34:23.520696 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:23.520658 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" event={"ID":"de71d716-5be8-40e3-8f93-b768ee6635ea","Type":"ContainerStarted","Data":"603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e"} Apr 22 13:34:23.545687 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:23.545635 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" podStartSLOduration=1.907487187 podStartE2EDuration="4m35.545622535s" podCreationTimestamp="2026-04-22 13:29:48 +0000 UTC" firstStartedPulling="2026-04-22 13:29:49.098726261 +0000 UTC m=+505.751830023" lastFinishedPulling="2026-04-22 13:34:22.736861599 +0000 UTC m=+779.389965371" observedRunningTime="2026-04-22 13:34:23.545473752 +0000 UTC m=+780.198577534" watchObservedRunningTime="2026-04-22 13:34:23.545622535 +0000 UTC m=+780.198726316" Apr 22 13:34:28.537904 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:28.537871 2567 generic.go:358] "Generic (PLEG): container finished" podID="de71d716-5be8-40e3-8f93-b768ee6635ea" containerID="603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e" exitCode=0 Apr 22 13:34:28.538332 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:28.537945 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" event={"ID":"de71d716-5be8-40e3-8f93-b768ee6635ea","Type":"ContainerDied","Data":"603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e"} Apr 22 13:34:29.667902 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:29.667879 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:34:29.742438 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:29.742405 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp485\" (UniqueName: \"kubernetes.io/projected/de71d716-5be8-40e3-8f93-b768ee6635ea-kube-api-access-bp485\") pod \"de71d716-5be8-40e3-8f93-b768ee6635ea\" (UID: \"de71d716-5be8-40e3-8f93-b768ee6635ea\") " Apr 22 13:34:29.744501 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:29.744475 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de71d716-5be8-40e3-8f93-b768ee6635ea-kube-api-access-bp485" (OuterVolumeSpecName: "kube-api-access-bp485") pod "de71d716-5be8-40e3-8f93-b768ee6635ea" (UID: "de71d716-5be8-40e3-8f93-b768ee6635ea"). InnerVolumeSpecName "kube-api-access-bp485". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:34:29.843721 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:29.843645 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bp485\" (UniqueName: \"kubernetes.io/projected/de71d716-5be8-40e3-8f93-b768ee6635ea-kube-api-access-bp485\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:34:30.545478 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.545445 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" event={"ID":"de71d716-5be8-40e3-8f93-b768ee6635ea","Type":"ContainerDied","Data":"e83c1b70f5f481e412916cab09b5095bfd3b9b65d3e35de511d3c546cff429b7"} Apr 22 13:34:30.545478 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.545477 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83c1b70f5f481e412916cab09b5095bfd3b9b65d3e35de511d3c546cff429b7" Apr 22 13:34:30.545675 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.545483 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss" Apr 22 13:34:30.965120 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.965035 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf"] Apr 22 13:34:30.965603 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.965456 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de71d716-5be8-40e3-8f93-b768ee6635ea" containerName="node" Apr 22 13:34:30.965603 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.965474 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de71d716-5be8-40e3-8f93-b768ee6635ea" containerName="node" Apr 22 13:34:30.965603 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:30.965564 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="de71d716-5be8-40e3-8f93-b768ee6635ea" containerName="node" Apr 22 13:34:31.155526 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.155481 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf"] Apr 22 13:34:31.155710 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.155614 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:34:31.158362 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.158339 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-8csd6\"/\"default-dockercfg-t9r6l\"" Apr 22 13:34:31.158507 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.158404 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-8csd6\"/\"kube-root-ca.crt\"" Apr 22 13:34:31.158935 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.158919 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-8csd6\"/\"openshift-service-ca.crt\"" Apr 22 13:34:31.253999 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.253916 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8dm\" (UniqueName: \"kubernetes.io/projected/4f40fd81-dd11-4284-8ecf-e1ab02296c11-kube-api-access-bq8dm\") pod \"test-trainjob-9d22l-node-0-0-lxdlf\" (UID: \"4f40fd81-dd11-4284-8ecf-e1ab02296c11\") " pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:34:31.354628 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.354596 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8dm\" (UniqueName: \"kubernetes.io/projected/4f40fd81-dd11-4284-8ecf-e1ab02296c11-kube-api-access-bq8dm\") pod \"test-trainjob-9d22l-node-0-0-lxdlf\" (UID: \"4f40fd81-dd11-4284-8ecf-e1ab02296c11\") " pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:34:31.362865 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.362833 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8dm\" (UniqueName: \"kubernetes.io/projected/4f40fd81-dd11-4284-8ecf-e1ab02296c11-kube-api-access-bq8dm\") pod \"test-trainjob-9d22l-node-0-0-lxdlf\" (UID: \"4f40fd81-dd11-4284-8ecf-e1ab02296c11\") " 
pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:34:31.463867 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.463836 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:34:31.580148 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.580124 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf"] Apr 22 13:34:31.582602 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:34:31.582571 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f40fd81_dd11_4284_8ecf_e1ab02296c11.slice/crio-33f27fdecace48bf8f7b2f0d14a110853de8243fe349aaccf28805fbd3a8ee8d WatchSource:0}: Error finding container 33f27fdecace48bf8f7b2f0d14a110853de8243fe349aaccf28805fbd3a8ee8d: Status 404 returned error can't find the container with id 33f27fdecace48bf8f7b2f0d14a110853de8243fe349aaccf28805fbd3a8ee8d Apr 22 13:34:31.584650 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:31.584634 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 13:34:32.557467 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:34:32.557429 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" event={"ID":"4f40fd81-dd11-4284-8ecf-e1ab02296c11","Type":"ContainerStarted","Data":"33f27fdecace48bf8f7b2f0d14a110853de8243fe349aaccf28805fbd3a8ee8d"} Apr 22 13:36:23.877965 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:36:23.877929 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:36:23.880648 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:36:23.880627 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:38:40.429453 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:40.429414 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" event={"ID":"4f40fd81-dd11-4284-8ecf-e1ab02296c11","Type":"ContainerStarted","Data":"8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03"} Apr 22 13:38:40.456364 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:40.456314 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" podStartSLOduration=2.287517974 podStartE2EDuration="4m10.456295787s" podCreationTimestamp="2026-04-22 13:34:30 +0000 UTC" firstStartedPulling="2026-04-22 13:34:31.584806702 +0000 UTC m=+788.237910475" lastFinishedPulling="2026-04-22 13:38:39.753584524 +0000 UTC m=+1036.406688288" observedRunningTime="2026-04-22 13:38:40.455380325 +0000 UTC m=+1037.108484106" watchObservedRunningTime="2026-04-22 13:38:40.456295787 +0000 UTC m=+1037.109399569" Apr 22 13:38:46.451729 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:46.451692 2567 generic.go:358] "Generic (PLEG): container finished" podID="4f40fd81-dd11-4284-8ecf-e1ab02296c11" containerID="8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03" exitCode=0 Apr 22 13:38:46.452224 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:46.451769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" event={"ID":"4f40fd81-dd11-4284-8ecf-e1ab02296c11","Type":"ContainerDied","Data":"8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03"} Apr 22 13:38:47.749821 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:47.749794 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:38:47.830050 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:47.830013 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq8dm\" (UniqueName: \"kubernetes.io/projected/4f40fd81-dd11-4284-8ecf-e1ab02296c11-kube-api-access-bq8dm\") pod \"4f40fd81-dd11-4284-8ecf-e1ab02296c11\" (UID: \"4f40fd81-dd11-4284-8ecf-e1ab02296c11\") " Apr 22 13:38:47.832186 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:47.832135 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f40fd81-dd11-4284-8ecf-e1ab02296c11-kube-api-access-bq8dm" (OuterVolumeSpecName: "kube-api-access-bq8dm") pod "4f40fd81-dd11-4284-8ecf-e1ab02296c11" (UID: "4f40fd81-dd11-4284-8ecf-e1ab02296c11"). InnerVolumeSpecName "kube-api-access-bq8dm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:38:47.930601 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:47.930565 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq8dm\" (UniqueName: \"kubernetes.io/projected/4f40fd81-dd11-4284-8ecf-e1ab02296c11-kube-api-access-bq8dm\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:38:48.459005 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:48.458974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" event={"ID":"4f40fd81-dd11-4284-8ecf-e1ab02296c11","Type":"ContainerDied","Data":"33f27fdecace48bf8f7b2f0d14a110853de8243fe349aaccf28805fbd3a8ee8d"} Apr 22 13:38:48.459005 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:48.459006 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f27fdecace48bf8f7b2f0d14a110853de8243fe349aaccf28805fbd3a8ee8d" Apr 22 13:38:48.459005 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:48.458990 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf" Apr 22 13:38:49.168071 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.168034 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv"] Apr 22 13:38:49.168475 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.168347 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f40fd81-dd11-4284-8ecf-e1ab02296c11" containerName="node" Apr 22 13:38:49.168475 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.168358 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f40fd81-dd11-4284-8ecf-e1ab02296c11" containerName="node" Apr 22 13:38:49.168475 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.168400 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f40fd81-dd11-4284-8ecf-e1ab02296c11" containerName="node" Apr 22 13:38:49.572298 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.572265 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv"] Apr 22 13:38:49.572468 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.572371 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:38:49.575154 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.575128 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mb28x\"/\"openshift-service-ca.crt\"" Apr 22 13:38:49.575291 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.575180 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-mb28x\"/\"kube-root-ca.crt\"" Apr 22 13:38:49.575642 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.575626 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-mb28x\"/\"default-dockercfg-8vvlb\"" Apr 22 13:38:49.643746 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.643718 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8tfz\" (UniqueName: \"kubernetes.io/projected/26c1adff-3288-44e8-8971-8f979531d0e4-kube-api-access-r8tfz\") pod \"test-trainjob-9jmlw-node-0-0-zr6qv\" (UID: \"26c1adff-3288-44e8-8971-8f979531d0e4\") " pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:38:49.745063 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.745017 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8tfz\" (UniqueName: \"kubernetes.io/projected/26c1adff-3288-44e8-8971-8f979531d0e4-kube-api-access-r8tfz\") pod \"test-trainjob-9jmlw-node-0-0-zr6qv\" (UID: \"26c1adff-3288-44e8-8971-8f979531d0e4\") " pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:38:49.753922 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.753883 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8tfz\" (UniqueName: \"kubernetes.io/projected/26c1adff-3288-44e8-8971-8f979531d0e4-kube-api-access-r8tfz\") pod \"test-trainjob-9jmlw-node-0-0-zr6qv\" (UID: \"26c1adff-3288-44e8-8971-8f979531d0e4\") " 
pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:38:49.881493 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:49.881414 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:38:50.134630 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:50.134609 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv"] Apr 22 13:38:50.139801 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:38:50.139378 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26c1adff_3288_44e8_8971_8f979531d0e4.slice/crio-5f3b1d58c181622610e1db5199745be3a4101f6c0a6ae46f9c80a40e9384a53a WatchSource:0}: Error finding container 5f3b1d58c181622610e1db5199745be3a4101f6c0a6ae46f9c80a40e9384a53a: Status 404 returned error can't find the container with id 5f3b1d58c181622610e1db5199745be3a4101f6c0a6ae46f9c80a40e9384a53a Apr 22 13:38:50.471103 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:38:50.471032 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" event={"ID":"26c1adff-3288-44e8-8971-8f979531d0e4","Type":"ContainerStarted","Data":"5f3b1d58c181622610e1db5199745be3a4101f6c0a6ae46f9c80a40e9384a53a"} Apr 22 13:39:17.664658 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:17.664559 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d979f9785-8m4vg"] Apr 22 13:39:42.686204 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:42.686142 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d979f9785-8m4vg" podUID="b4832ef4-7313-458a-9ed5-3ff6e466d3eb" containerName="console" containerID="cri-o://c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e" gracePeriod=15 Apr 22 13:39:42.967901 ip-10-0-142-133 kubenswrapper[2567]: I0422 
13:39:42.967877 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d979f9785-8m4vg_b4832ef4-7313-458a-9ed5-3ff6e466d3eb/console/0.log" Apr 22 13:39:42.968023 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:42.967939 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:39:43.091226 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091189 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-serving-cert\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091410 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091236 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-oauth-serving-cert\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091410 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091262 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-config\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091410 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091289 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-trusted-ca-bundle\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091410 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091337 2567 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-oauth-config\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091410 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091394 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-service-ca\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091410 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091412 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm2wr\" (UniqueName: \"kubernetes.io/projected/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-kube-api-access-jm2wr\") pod \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\" (UID: \"b4832ef4-7313-458a-9ed5-3ff6e466d3eb\") " Apr 22 13:39:43.091827 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091733 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:39:43.091924 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091815 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-config" (OuterVolumeSpecName: "console-config") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:39:43.091924 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091748 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:39:43.092020 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.091917 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-service-ca" (OuterVolumeSpecName: "service-ca") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:39:43.093802 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.093767 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 13:39:43.093930 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.093864 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 13:39:43.093992 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.093926 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-kube-api-access-jm2wr" (OuterVolumeSpecName: "kube-api-access-jm2wr") pod "b4832ef4-7313-458a-9ed5-3ff6e466d3eb" (UID: "b4832ef4-7313-458a-9ed5-3ff6e466d3eb"). InnerVolumeSpecName "kube-api-access-jm2wr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:39:43.192818 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192780 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-oauth-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.192818 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192814 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-service-ca\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.192818 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192827 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jm2wr\" (UniqueName: \"kubernetes.io/projected/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-kube-api-access-jm2wr\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.193055 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192836 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.193055 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192845 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-oauth-serving-cert\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.193055 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192854 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-console-config\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.193055 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.192863 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4832ef4-7313-458a-9ed5-3ff6e466d3eb-trusted-ca-bundle\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:39:43.665501 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.665472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d979f9785-8m4vg_b4832ef4-7313-458a-9ed5-3ff6e466d3eb/console/0.log" Apr 22 13:39:43.665665 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.665516 2567 generic.go:358] "Generic (PLEG): container finished" podID="b4832ef4-7313-458a-9ed5-3ff6e466d3eb" containerID="c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e" exitCode=2 Apr 22 13:39:43.665665 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.665582 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d979f9785-8m4vg" Apr 22 13:39:43.665665 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.665597 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d979f9785-8m4vg" event={"ID":"b4832ef4-7313-458a-9ed5-3ff6e466d3eb","Type":"ContainerDied","Data":"c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e"} Apr 22 13:39:43.665665 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.665634 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d979f9785-8m4vg" event={"ID":"b4832ef4-7313-458a-9ed5-3ff6e466d3eb","Type":"ContainerDied","Data":"87632dda8be4dc4fe0ec955476182f9cea8b49cb5c338722884ca0c4b6861f59"} Apr 22 13:39:43.665665 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.665650 2567 scope.go:117] "RemoveContainer" containerID="c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e" Apr 22 13:39:43.680824 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.680809 2567 scope.go:117] "RemoveContainer" containerID="c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e" Apr 22 13:39:43.681062 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:39:43.681045 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e\": container with ID starting with c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e not found: ID does not exist" containerID="c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e" Apr 22 13:39:43.681100 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.681070 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e"} err="failed to get container status \"c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e\": rpc error: code = 
NotFound desc = could not find container \"c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e\": container with ID starting with c9bb5e1f6e39578865a040b9a0091ecfd42d9488be7e1d9dfecc0c18755df26e not found: ID does not exist" Apr 22 13:39:43.688222 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.688199 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d979f9785-8m4vg"] Apr 22 13:39:43.692213 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.692187 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d979f9785-8m4vg"] Apr 22 13:39:43.950454 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:39:43.950359 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4832ef4-7313-458a-9ed5-3ff6e466d3eb" path="/var/lib/kubelet/pods/b4832ef4-7313-458a-9ed5-3ff6e466d3eb/volumes" Apr 22 13:40:10.765206 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:10.765150 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" event={"ID":"26c1adff-3288-44e8-8971-8f979531d0e4","Type":"ContainerStarted","Data":"f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d"} Apr 22 13:40:10.782378 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:10.782323 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" podStartSLOduration=2.142802351 podStartE2EDuration="1m21.782307172s" podCreationTimestamp="2026-04-22 13:38:49 +0000 UTC" firstStartedPulling="2026-04-22 13:38:50.141386097 +0000 UTC m=+1046.794489862" lastFinishedPulling="2026-04-22 13:40:09.780890912 +0000 UTC m=+1126.433994683" observedRunningTime="2026-04-22 13:40:10.78162497 +0000 UTC m=+1127.434728753" watchObservedRunningTime="2026-04-22 13:40:10.782307172 +0000 UTC m=+1127.435411017" Apr 22 13:40:12.773176 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:12.773127 2567 generic.go:358] "Generic 
(PLEG): container finished" podID="26c1adff-3288-44e8-8971-8f979531d0e4" containerID="f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d" exitCode=0 Apr 22 13:40:12.773542 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:12.773204 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" event={"ID":"26c1adff-3288-44e8-8971-8f979531d0e4","Type":"ContainerDied","Data":"f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d"} Apr 22 13:40:13.897686 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:13.897662 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:40:14.055785 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:14.055709 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8tfz\" (UniqueName: \"kubernetes.io/projected/26c1adff-3288-44e8-8971-8f979531d0e4-kube-api-access-r8tfz\") pod \"26c1adff-3288-44e8-8971-8f979531d0e4\" (UID: \"26c1adff-3288-44e8-8971-8f979531d0e4\") " Apr 22 13:40:14.057856 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:14.057823 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c1adff-3288-44e8-8971-8f979531d0e4-kube-api-access-r8tfz" (OuterVolumeSpecName: "kube-api-access-r8tfz") pod "26c1adff-3288-44e8-8971-8f979531d0e4" (UID: "26c1adff-3288-44e8-8971-8f979531d0e4"). InnerVolumeSpecName "kube-api-access-r8tfz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:40:14.157187 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:14.157132 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8tfz\" (UniqueName: \"kubernetes.io/projected/26c1adff-3288-44e8-8971-8f979531d0e4-kube-api-access-r8tfz\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:40:14.780380 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:14.780352 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" Apr 22 13:40:14.780559 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:14.780381 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv" event={"ID":"26c1adff-3288-44e8-8971-8f979531d0e4","Type":"ContainerDied","Data":"5f3b1d58c181622610e1db5199745be3a4101f6c0a6ae46f9c80a40e9384a53a"} Apr 22 13:40:14.780559 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:14.780413 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3b1d58c181622610e1db5199745be3a4101f6c0a6ae46f9c80a40e9384a53a" Apr 22 13:40:15.528408 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528380 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f"] Apr 22 13:40:15.528788 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528663 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4832ef4-7313-458a-9ed5-3ff6e466d3eb" containerName="console" Apr 22 13:40:15.528788 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528675 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4832ef4-7313-458a-9ed5-3ff6e466d3eb" containerName="console" Apr 22 13:40:15.528788 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528681 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="26c1adff-3288-44e8-8971-8f979531d0e4" containerName="node" Apr 22 13:40:15.528788 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528686 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c1adff-3288-44e8-8971-8f979531d0e4" containerName="node" Apr 22 13:40:15.528788 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528745 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="26c1adff-3288-44e8-8971-8f979531d0e4" containerName="node" Apr 22 13:40:15.528788 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.528756 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4832ef4-7313-458a-9ed5-3ff6e466d3eb" containerName="console" Apr 22 13:40:15.642578 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.642531 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f"] Apr 22 13:40:15.642747 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.642649 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:40:15.645329 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.645290 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-9xzm7\"/\"kube-root-ca.crt\"" Apr 22 13:40:15.645329 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.645305 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-9xzm7\"/\"openshift-service-ca.crt\"" Apr 22 13:40:15.646105 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.646087 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-9xzm7\"/\"default-dockercfg-tqmm4\"" Apr 22 13:40:15.769511 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.769471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m88v5\" (UniqueName: \"kubernetes.io/projected/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0-kube-api-access-m88v5\") pod \"test-trainjob-rhnpk-node-0-0-xbs4f\" (UID: \"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0\") " pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:40:15.869992 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.869899 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m88v5\" (UniqueName: \"kubernetes.io/projected/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0-kube-api-access-m88v5\") pod \"test-trainjob-rhnpk-node-0-0-xbs4f\" (UID: \"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0\") " pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:40:15.878554 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.878529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m88v5\" (UniqueName: \"kubernetes.io/projected/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0-kube-api-access-m88v5\") pod \"test-trainjob-rhnpk-node-0-0-xbs4f\" (UID: \"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0\") " 
pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:40:15.952090 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:15.952062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:40:16.069515 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:16.069489 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f"] Apr 22 13:40:16.072046 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:40:16.072020 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5ddc6a7_beec_455b_99c1_2cbf9b7a0bd0.slice/crio-9569907e11c071f5f99e01ea3898a7ba7591b5c5ed2cf7564829b784736c5451 WatchSource:0}: Error finding container 9569907e11c071f5f99e01ea3898a7ba7591b5c5ed2cf7564829b784736c5451: Status 404 returned error can't find the container with id 9569907e11c071f5f99e01ea3898a7ba7591b5c5ed2cf7564829b784736c5451 Apr 22 13:40:16.073963 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:16.073948 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 13:40:16.787505 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:40:16.787465 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" event={"ID":"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0","Type":"ContainerStarted","Data":"9569907e11c071f5f99e01ea3898a7ba7591b5c5ed2cf7564829b784736c5451"} Apr 22 13:41:23.904824 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:41:23.904785 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:41:23.908956 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:41:23.908935 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:46:59.417655 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:46:59.417577 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:46:59.434399 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:46:59.418104 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log" Apr 22 13:47:01.213581 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:01.213541 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" event={"ID":"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0","Type":"ContainerStarted","Data":"6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a"} Apr 22 13:47:01.216101 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:01.216085 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-9xzm7\"/\"default-dockercfg-tqmm4\"" Apr 22 13:47:01.231512 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:01.231491 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-9xzm7\"/\"kube-root-ca.crt\"" Apr 22 13:47:01.237321 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:01.237281 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" podStartSLOduration=2.000304275 podStartE2EDuration="6m46.237267296s" podCreationTimestamp="2026-04-22 13:40:15 +0000 UTC" firstStartedPulling="2026-04-22 13:40:16.074073569 +0000 UTC m=+1132.727177328" lastFinishedPulling="2026-04-22 13:47:00.311036587 +0000 UTC m=+1536.964140349" observedRunningTime="2026-04-22 13:47:01.235945051 +0000 UTC m=+1537.889048832" watchObservedRunningTime="2026-04-22 
13:47:01.237267296 +0000 UTC m=+1537.890371076" Apr 22 13:47:01.242155 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:01.242136 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-9xzm7\"/\"openshift-service-ca.crt\"" Apr 22 13:47:05.228971 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:05.228938 2567 generic.go:358] "Generic (PLEG): container finished" podID="c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0" containerID="6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a" exitCode=0 Apr 22 13:47:05.229452 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:05.229015 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" event={"ID":"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0","Type":"ContainerDied","Data":"6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a"} Apr 22 13:47:06.361238 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:06.361215 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:47:06.368430 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:06.368411 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m88v5\" (UniqueName: \"kubernetes.io/projected/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0-kube-api-access-m88v5\") pod \"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0\" (UID: \"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0\") " Apr 22 13:47:06.370472 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:06.370449 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0-kube-api-access-m88v5" (OuterVolumeSpecName: "kube-api-access-m88v5") pod "c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0" (UID: "c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0"). InnerVolumeSpecName "kube-api-access-m88v5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:47:06.469309 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:06.469272 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m88v5\" (UniqueName: \"kubernetes.io/projected/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0-kube-api-access-m88v5\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 13:47:07.236396 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:07.236367 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" Apr 22 13:47:07.236563 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:07.236403 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f" event={"ID":"c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0","Type":"ContainerDied","Data":"9569907e11c071f5f99e01ea3898a7ba7591b5c5ed2cf7564829b784736c5451"} Apr 22 13:47:07.236563 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:07.236433 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9569907e11c071f5f99e01ea3898a7ba7591b5c5ed2cf7564829b784736c5451" Apr 22 13:47:08.404267 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.404231 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"] Apr 22 13:47:08.404631 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.404556 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0" containerName="node" Apr 22 13:47:08.404631 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.404567 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0" containerName="node" Apr 22 13:47:08.404631 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.404613 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0" 
containerName="node" Apr 22 13:47:08.427338 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.427306 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"] Apr 22 13:47:08.427548 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.427408 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" Apr 22 13:47:08.429463 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.429442 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wwz8b\"/\"kube-root-ca.crt\"" Apr 22 13:47:08.430246 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.430225 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wwz8b\"/\"openshift-service-ca.crt\"" Apr 22 13:47:08.430352 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.430269 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-wwz8b\"/\"default-dockercfg-j75bt\"" Apr 22 13:47:08.484179 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.484142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnqb\" (UniqueName: \"kubernetes.io/projected/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af-kube-api-access-4tnqb\") pod \"test-trainjob-l4mgr-node-0-0-7l2lr\" (UID: \"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af\") " pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" Apr 22 13:47:08.585400 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.585363 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tnqb\" (UniqueName: \"kubernetes.io/projected/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af-kube-api-access-4tnqb\") pod \"test-trainjob-l4mgr-node-0-0-7l2lr\" (UID: \"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af\") " pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" Apr 22 13:47:08.593732 ip-10-0-142-133 
kubenswrapper[2567]: I0422 13:47:08.593705 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tnqb\" (UniqueName: \"kubernetes.io/projected/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af-kube-api-access-4tnqb\") pod \"test-trainjob-l4mgr-node-0-0-7l2lr\" (UID: \"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af\") " pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"
Apr 22 13:47:08.736887 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.736802 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"
Apr 22 13:47:08.853808 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.853777 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"]
Apr 22 13:47:08.856663 ip-10-0-142-133 kubenswrapper[2567]: W0422 13:47:08.856638 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3627d17b_1d54_4bfe_83f9_6eec9b3ba8af.slice/crio-68285a181d724960d9c4da97875db2b65cfed086fd902c7a8a113aaf189751ac WatchSource:0}: Error finding container 68285a181d724960d9c4da97875db2b65cfed086fd902c7a8a113aaf189751ac: Status 404 returned error can't find the container with id 68285a181d724960d9c4da97875db2b65cfed086fd902c7a8a113aaf189751ac
Apr 22 13:47:08.858558 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:08.858541 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 13:47:09.243902 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:47:09.243861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" event={"ID":"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af","Type":"ContainerStarted","Data":"68285a181d724960d9c4da97875db2b65cfed086fd902c7a8a113aaf189751ac"}
Apr 22 13:51:59.443968 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:51:59.443847 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:51:59.443968 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:51:59.443910 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:53:29.041251 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:53:29.041216 2567 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage"
Apr 22 13:53:29.041809 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:53:29.041278 2567 container_gc.go:86] "Attempting to delete unused containers"
Apr 22 13:53:29.042709 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:53:29.042688 2567 scope.go:117] "RemoveContainer" containerID="6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a"
Apr 22 13:53:31.692374 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:53:31.692340 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasDiskPressure"
Apr 22 13:54:29.437434 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:54:29.437379 2567 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 22 13:54:29.437434 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:54:29.437436 2567 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 22 13:54:29.437948 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:54:29.437447 2567 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 22 13:55:29.043534 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:55:29.043478 2567 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a"
Apr 22 13:55:29.043534 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:55:29.043538 2567 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a"
Apr 22 13:55:29.044054 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:55:29.043560 2567 scope.go:117] "RemoveContainer" containerID="f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d"
Apr 22 13:56:59.438597 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:56:59.438536 2567 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 22 13:56:59.438597 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:56:59.438602 2567 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 22 13:56:59.439138 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:56:59.438616 2567 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 22 13:57:28.021883 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.021576 2567 scope.go:117] "RemoveContainer" containerID="603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e"
Apr 22 13:57:28.047136 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.047119 2567 scope.go:117] "RemoveContainer" containerID="fb22c9bcc9e3448ed97d7fb15772ad52791b5e74e4084d54173051e83539639d"
Apr 22 13:57:28.055802 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.055769 2567 scope.go:117] "RemoveContainer" containerID="b57ff6e2c9ca4b7f86006f7170c394e0df7c04f8daac2d98b781138264c3797c"
Apr 22 13:57:28.067595 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.067574 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:57:28.067696 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.067604 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:57:28.094831 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.094808 2567 scope.go:117] "RemoveContainer" containerID="dd8325561b07185e74e9e5998b8a499a979d37d94daba65980c6595630b22e4e"
Apr 22 13:57:28.101816 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.101795 2567 scope.go:117] "RemoveContainer" containerID="cc7ef61098248b1c2d2760ae936e1799cb9accb0a7f903e7b77329a5911134c8"
Apr 22 13:57:28.138226 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.138203 2567 scope.go:117] "RemoveContainer" containerID="9fc005cd018b409050a9d754cdc2580ee1ec686fd61e8e00a3f388a0323c0585"
Apr 22 13:57:28.145908 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.145885 2567 scope.go:117] "RemoveContainer" containerID="baedf63bce31ffd128ba6acec26ff8cd7babde4a503b2f21fbad923cf045f304"
Apr 22 13:57:28.153180 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.153145 2567 scope.go:117] "RemoveContainer" containerID="8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03"
Apr 22 13:57:28.190431 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.190410 2567 image_gc_manager.go:447] "Attempting to delete unused images"
Apr 22 13:57:28.207827 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.207804 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 13:57:28.209209 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.209187 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler=""
Apr 22 13:57:28.702538 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:57:28.702469 2567 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_ExperimentalDTree_Contraction_l_Ailk_Bjlk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6"
Apr 22 13:57:28.702749 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:57:28.702710 2567 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-l4mgr-node-0-0.test-trainjob-l4mgr,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4tnqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-l4mgr-node-0-0-7l2lr_test-ns-wwz8b(3627d17b-1d54-4bfe-83f9-6eec9b3ba8af): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_ExperimentalDTree_Contraction_l_Ailk_Bjlk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 22 13:57:28.703942 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:57:28.703902 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_ExperimentalDTree_Contraction_l_Ailk_Bjlk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" podUID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af"
Apr 22 13:57:28.729257 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.729232 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler=""
Apr 22 13:57:28.787146 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.787077 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="8cfae5f12a3d5e8f5711d1531d223358c13a3d4b36be844d8c6890efdfa09339" size=622989096 runtimeHandler=""
Apr 22 13:57:28.852240 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:28.852208 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler=""
Apr 22 13:57:29.420985 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:29.420957 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-wwz8b\"/\"default-dockercfg-j75bt\""
Apr 22 13:57:29.452660 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:29.450396 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wwz8b\"/\"kube-root-ca.crt\""
Apr 22 13:57:29.460622 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:29.460596 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wwz8b\"/\"openshift-service-ca.crt\""
Apr 22 13:57:32.737796 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:32.737327 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="6891cc672306b725c5068715074791d001570e3be37e86cf1102543dae17aca6" size=7588072890 runtimeHandler=""
Apr 22 13:57:32.737796 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:32.737674 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 13:57:32.738133 ip-10-0-142-133 kubenswrapper[2567]: E0422 13:57:32.737884 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_ExperimentalDTree_Contraction_l_Ailk_Bjlk_Cijk_Dijk_gfx90a.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" podUID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af"
Apr 22 13:57:35.879076 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:35.879038 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler=""
Apr 22 13:57:38.787594 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:38.787542 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="f9c8cb14b315efbe3847333b6de717d1a52318bb05b38cce743926641075fbb5" size=884076775 runtimeHandler=""
Apr 22 13:57:38.812297 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:38.812264 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="ba0d5ab4eb24f99d84ae4923fefa85e3ab5042c1e554dcca3a41789529499633" size=107183730 runtimeHandler=""
Apr 22 13:57:38.823587 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:38.823558 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="ec845ac5d8f1d4c74cbd447a93360fa7b8b615723fab3a377882708da6009878" size=977364430 runtimeHandler=""
Apr 22 13:57:38.848111 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:38.848080 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="df7311fe93e730dc6d3d65a73c992b1583cc3d49b2e20975439f4718eb9ac4f5" size=108503547 runtimeHandler=""
Apr 22 13:57:38.859584 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:38.859554 2567 image_gc_manager.go:514] "Removing image to free bytes" imageID="7e65b8288e37c3f4fac04e8bf51240765caae34795b317d44d5399762a08b761" size=23201654702 runtimeHandler=""
Apr 22 13:57:43.011721 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:57:43.011693 2567 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." resourceName="ephemeral-storage"
Apr 22 13:58:37.370629 ip-10-0-142-133 kubenswrapper[2567]: I0422 13:58:37.370545 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:02:28.086767 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:02:28.086737 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 14:02:28.089371 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:02:28.088284 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 14:06:08.887496 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:06:08.887334 2567 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil"
Apr 22 14:06:08.887496 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:06:08.887401 2567 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 22 14:06:08.887496 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:06:08.887416 2567 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded"
Apr 22 14:08:01.241942 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:01.241864 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 14:08:01.241942 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:01.241919 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 14:08:02.481367 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:02.481328 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" event={"ID":"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af","Type":"ContainerStarted","Data":"a0d4d07fe8e03e3bebdab9d24616183d29022c5152816a0278c5d8c3dfb64f4f"}
Apr 22 14:08:02.484207 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:02.484185 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-wwz8b\"/\"default-dockercfg-j75bt\""
Apr 22 14:08:02.510522 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:02.510466 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" podStartSLOduration=1.408435968 podStartE2EDuration="20m54.510450939s" podCreationTimestamp="2026-04-22 13:47:08 +0000 UTC" firstStartedPulling="2026-04-22 13:47:08.858665697 +0000 UTC m=+1545.511769455" lastFinishedPulling="2026-04-22 14:08:01.960680667 +0000 UTC m=+2798.613784426" observedRunningTime="2026-04-22 14:08:02.509250649 +0000 UTC m=+2799.162354441" watchObservedRunningTime="2026-04-22 14:08:02.510450939 +0000 UTC m=+2799.163554721"
Apr 22 14:08:02.602546 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:02.602516 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wwz8b\"/\"kube-root-ca.crt\""
Apr 22 14:08:02.612325 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:02.612301 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wwz8b\"/\"openshift-service-ca.crt\""
Apr 22 14:08:18.531977 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:18.531937 2567 generic.go:358] "Generic (PLEG): container finished" podID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af" containerID="a0d4d07fe8e03e3bebdab9d24616183d29022c5152816a0278c5d8c3dfb64f4f" exitCode=0
Apr 22 14:08:18.532441 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:18.532010 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" event={"ID":"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af","Type":"ContainerDied","Data":"a0d4d07fe8e03e3bebdab9d24616183d29022c5152816a0278c5d8c3dfb64f4f"}
Apr 22 14:08:19.664144 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:19.664120 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"
Apr 22 14:08:19.838876 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:19.838776 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tnqb\" (UniqueName: \"kubernetes.io/projected/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af-kube-api-access-4tnqb\") pod \"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af\" (UID: \"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af\") "
Apr 22 14:08:19.840963 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:19.840927 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af-kube-api-access-4tnqb" (OuterVolumeSpecName: "kube-api-access-4tnqb") pod "3627d17b-1d54-4bfe-83f9-6eec9b3ba8af" (UID: "3627d17b-1d54-4bfe-83f9-6eec9b3ba8af"). InnerVolumeSpecName "kube-api-access-4tnqb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:08:19.939253 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:19.939221 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4tnqb\" (UniqueName: \"kubernetes.io/projected/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af-kube-api-access-4tnqb\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\""
Apr 22 14:08:20.539354 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:20.539328 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"
Apr 22 14:08:20.539526 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:20.539320 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr" event={"ID":"3627d17b-1d54-4bfe-83f9-6eec9b3ba8af","Type":"ContainerDied","Data":"68285a181d724960d9c4da97875db2b65cfed086fd902c7a8a113aaf189751ac"}
Apr 22 14:08:20.539526 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:20.539446 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68285a181d724960d9c4da97875db2b65cfed086fd902c7a8a113aaf189751ac"
Apr 22 14:08:20.730002 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:20.729973 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-wwz8b_test-trainjob-l4mgr-node-0-0-7l2lr_3627d17b-1d54-4bfe-83f9-6eec9b3ba8af/node/0.log"
Apr 22 14:08:20.817824 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:08:20.817737 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a\": container with ID starting with 6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a not found: ID does not exist" containerID="6336f783d666221815ed7e9a885dcedf813fee4c63d1fba6b461098ad6fb2c3a"
Apr 22 14:08:20.919062 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:08:20.919019 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d\": container with ID starting with f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d not found: ID does not exist" containerID="f1bb1496b4bce0764e87e9fddd3b7dcad06f00458381b72c0c4315cf39359e7d"
Apr 22 14:08:21.020448 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:08:21.020411 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03\": container with ID starting with 8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03 not found: ID does not exist" containerID="8768e09054c1e4400116da2bd0a456d8a2e3666b2ddcf4333077f65f26c22c03"
Apr 22 14:08:21.515503 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:08:21.515464 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e\": container with ID starting with 603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e not found: ID does not exist" containerID="603b73a7b8ef86758b54e78e1cc51ea7087e550b57863d1e2c42018b63dd1b9e"
Apr 22 14:08:22.460946 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.460911 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jhrrk/must-gather-thtz7"]
Apr 22 14:08:22.461331 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.461206 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af" containerName="node"
Apr 22 14:08:22.461331 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.461217 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af" containerName="node"
Apr 22 14:08:22.461331 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.461276 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af" containerName="node"
Apr 22 14:08:22.629269 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.629232 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jhrrk/must-gather-thtz7"]
Apr 22 14:08:22.629442 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.629377 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.632036 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.632009 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jhrrk\"/\"openshift-service-ca.crt\""
Apr 22 14:08:22.632661 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.632636 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jhrrk\"/\"default-dockercfg-gg8zb\""
Apr 22 14:08:22.632661 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.632666 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jhrrk\"/\"kube-root-ca.crt\""
Apr 22 14:08:22.656731 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.656707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqzj7\" (UniqueName: \"kubernetes.io/projected/14cd1d5f-2f87-4dde-a626-e6197565c625-kube-api-access-rqzj7\") pod \"must-gather-thtz7\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.656850 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.656762 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14cd1d5f-2f87-4dde-a626-e6197565c625-must-gather-output\") pod \"must-gather-thtz7\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.757765 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.757686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14cd1d5f-2f87-4dde-a626-e6197565c625-must-gather-output\") pod \"must-gather-thtz7\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.757765 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.757736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqzj7\" (UniqueName: \"kubernetes.io/projected/14cd1d5f-2f87-4dde-a626-e6197565c625-kube-api-access-rqzj7\") pod \"must-gather-thtz7\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.758041 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.758020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14cd1d5f-2f87-4dde-a626-e6197565c625-must-gather-output\") pod \"must-gather-thtz7\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.766844 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.766817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqzj7\" (UniqueName: \"kubernetes.io/projected/14cd1d5f-2f87-4dde-a626-e6197565c625-kube-api-access-rqzj7\") pod \"must-gather-thtz7\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:22.938817 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:22.938783 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jhrrk/must-gather-thtz7"
Apr 22 14:08:23.132508 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:23.132397 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jhrrk/must-gather-thtz7"]
Apr 22 14:08:23.134855 ip-10-0-142-133 kubenswrapper[2567]: W0422 14:08:23.134821 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14cd1d5f_2f87_4dde_a626_e6197565c625.slice/crio-73b25ccc8d24a4479bd2334f3f962ea42cfda7fe27d38d9e9760995d4283a296 WatchSource:0}: Error finding container 73b25ccc8d24a4479bd2334f3f962ea42cfda7fe27d38d9e9760995d4283a296: Status 404 returned error can't find the container with id 73b25ccc8d24a4479bd2334f3f962ea42cfda7fe27d38d9e9760995d4283a296
Apr 22 14:08:23.136925 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:23.136906 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:08:23.550382 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:23.550350 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrrk/must-gather-thtz7" event={"ID":"14cd1d5f-2f87-4dde-a626-e6197565c625","Type":"ContainerStarted","Data":"73b25ccc8d24a4479bd2334f3f962ea42cfda7fe27d38d9e9760995d4283a296"}
Apr 22 14:08:25.763068 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.763036 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"]
Apr 22 14:08:25.766307 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.766282 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-wwz8b/test-trainjob-l4mgr-node-0-0-7l2lr"]
Apr 22 14:08:25.864040 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.864000 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f"]
Apr 22 14:08:25.868668 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.868640 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-9xzm7/test-trainjob-rhnpk-node-0-0-xbs4f"]
Apr 22 14:08:25.949481 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.949445 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3627d17b-1d54-4bfe-83f9-6eec9b3ba8af" path="/var/lib/kubelet/pods/3627d17b-1d54-4bfe-83f9-6eec9b3ba8af/volumes"
Apr 22 14:08:25.949898 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.949878 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0" path="/var/lib/kubelet/pods/c5ddc6a7-beec-455b-99c1-2cbf9b7a0bd0/volumes"
Apr 22 14:08:25.964829 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.964793 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv"]
Apr 22 14:08:25.969082 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:25.969054 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-mb28x/test-trainjob-9jmlw-node-0-0-zr6qv"]
Apr 22 14:08:26.136346 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:26.136234 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf"]
Apr 22 14:08:26.140248 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:26.140212 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-8csd6/test-trainjob-9d22l-node-0-0-lxdlf"]
Apr 22 14:08:26.731379 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:26.731343 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss"]
Apr 22 14:08:26.735542 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:26.735516 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-c5gp4/test-trainjob-hcdvj-node-0-0-hnvss"]
Apr 22 14:08:27.954955 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:27.954919 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c1adff-3288-44e8-8971-8f979531d0e4" path="/var/lib/kubelet/pods/26c1adff-3288-44e8-8971-8f979531d0e4/volumes"
Apr 22 14:08:27.955409 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:27.955321 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f40fd81-dd11-4284-8ecf-e1ab02296c11" path="/var/lib/kubelet/pods/4f40fd81-dd11-4284-8ecf-e1ab02296c11/volumes"
Apr 22 14:08:27.955655 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:27.955637 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de71d716-5be8-40e3-8f93-b768ee6635ea" path="/var/lib/kubelet/pods/de71d716-5be8-40e3-8f93-b768ee6635ea/volumes"
Apr 22 14:08:29.573476 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:29.573441 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrrk/must-gather-thtz7" event={"ID":"14cd1d5f-2f87-4dde-a626-e6197565c625","Type":"ContainerStarted","Data":"c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752"}
Apr 22 14:08:29.573476 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:29.573476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrrk/must-gather-thtz7" event={"ID":"14cd1d5f-2f87-4dde-a626-e6197565c625","Type":"ContainerStarted","Data":"59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c"}
Apr 22 14:08:29.590568 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:29.590519 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jhrrk/must-gather-thtz7" podStartSLOduration=2.190952988 podStartE2EDuration="7.590497433s" podCreationTimestamp="2026-04-22 14:08:22 +0000 UTC" firstStartedPulling="2026-04-22 14:08:23.137034181 +0000 UTC m=+2819.790137939" lastFinishedPulling="2026-04-22 14:08:28.536578621 +0000 UTC m=+2825.189682384" observedRunningTime="2026-04-22 14:08:29.5887122 +0000 UTC m=+2826.241815994" watchObservedRunningTime="2026-04-22 14:08:29.590497433 +0000 UTC m=+2826.243601214"
Apr 22 14:08:39.385414 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:39.385374 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-5fr9f_a2d70830-0860-467c-918f-48559b25bef3/manager/0.log"
Apr 22 14:08:39.831549 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:39.831499 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-5fr9f_a2d70830-0860-467c-918f-48559b25bef3/manager/0.log"
Apr 22 14:08:40.327864 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:08:40.327827 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-5fr9f_a2d70830-0860-467c-918f-48559b25bef3/manager/0.log"
Apr 22 14:09:19.749226 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:19.749188 2567 generic.go:358] "Generic (PLEG): container finished" podID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerID="59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c" exitCode=0
Apr 22 14:09:19.749226 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:19.749231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jhrrk/must-gather-thtz7" event={"ID":"14cd1d5f-2f87-4dde-a626-e6197565c625","Type":"ContainerDied","Data":"59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c"}
Apr 22 14:09:19.749668 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:19.749584 2567 scope.go:117] "RemoveContainer" containerID="59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c"
Apr 22 14:09:19.890443 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:19.890409 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jhrrk_must-gather-thtz7_14cd1d5f-2f87-4dde-a626-e6197565c625/gather/0.log"
Apr 22 14:09:20.395794 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.395708 2567 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-must-gather-98vmz/must-gather-hb2d6"] Apr 22 14:09:20.399368 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.399342 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.402249 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.402223 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98vmz\"/\"openshift-service-ca.crt\"" Apr 22 14:09:20.402371 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.402246 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98vmz\"/\"kube-root-ca.crt\"" Apr 22 14:09:20.402371 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.402225 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-98vmz\"/\"default-dockercfg-254cb\"" Apr 22 14:09:20.406385 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.406300 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98vmz/must-gather-hb2d6"] Apr 22 14:09:20.545977 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.545936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckct\" (UniqueName: \"kubernetes.io/projected/8fd02574-44ad-401e-87c3-4d3450305514-kube-api-access-xckct\") pod \"must-gather-hb2d6\" (UID: \"8fd02574-44ad-401e-87c3-4d3450305514\") " pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.545977 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.545983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8fd02574-44ad-401e-87c3-4d3450305514-must-gather-output\") pod \"must-gather-hb2d6\" (UID: \"8fd02574-44ad-401e-87c3-4d3450305514\") " pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 
14:09:20.646501 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.646399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xckct\" (UniqueName: \"kubernetes.io/projected/8fd02574-44ad-401e-87c3-4d3450305514-kube-api-access-xckct\") pod \"must-gather-hb2d6\" (UID: \"8fd02574-44ad-401e-87c3-4d3450305514\") " pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.646501 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.646455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8fd02574-44ad-401e-87c3-4d3450305514-must-gather-output\") pod \"must-gather-hb2d6\" (UID: \"8fd02574-44ad-401e-87c3-4d3450305514\") " pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.646830 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.646809 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8fd02574-44ad-401e-87c3-4d3450305514-must-gather-output\") pod \"must-gather-hb2d6\" (UID: \"8fd02574-44ad-401e-87c3-4d3450305514\") " pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.654487 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.654460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckct\" (UniqueName: \"kubernetes.io/projected/8fd02574-44ad-401e-87c3-4d3450305514-kube-api-access-xckct\") pod \"must-gather-hb2d6\" (UID: \"8fd02574-44ad-401e-87c3-4d3450305514\") " pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.708527 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.708502 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98vmz/must-gather-hb2d6" Apr 22 14:09:20.824041 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:20.824015 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98vmz/must-gather-hb2d6"] Apr 22 14:09:20.826446 ip-10-0-142-133 kubenswrapper[2567]: W0422 14:09:20.826417 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd02574_44ad_401e_87c3_4d3450305514.slice/crio-bcd6789a9e2f355f641036f875e13e46c2b1d84d8d12556a0f8871f81b599068 WatchSource:0}: Error finding container bcd6789a9e2f355f641036f875e13e46c2b1d84d8d12556a0f8871f81b599068: Status 404 returned error can't find the container with id bcd6789a9e2f355f641036f875e13e46c2b1d84d8d12556a0f8871f81b599068 Apr 22 14:09:21.757952 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:21.757912 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/must-gather-hb2d6" event={"ID":"8fd02574-44ad-401e-87c3-4d3450305514","Type":"ContainerStarted","Data":"bcd6789a9e2f355f641036f875e13e46c2b1d84d8d12556a0f8871f81b599068"} Apr 22 14:09:22.767829 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:22.766956 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/must-gather-hb2d6" event={"ID":"8fd02574-44ad-401e-87c3-4d3450305514","Type":"ContainerStarted","Data":"2751ce43d11715461f033b008c11b5b798b0a8f3349f6420739a9b3e5a2c04eb"} Apr 22 14:09:22.767829 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:22.767005 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/must-gather-hb2d6" event={"ID":"8fd02574-44ad-401e-87c3-4d3450305514","Type":"ContainerStarted","Data":"1245a7e4c0d46e2838e09a67b4cbb43e2c146e514d28abb0beae4493ec7026aa"} Apr 22 14:09:22.784387 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:22.784332 2567 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-98vmz/must-gather-hb2d6" podStartSLOduration=1.912406485 podStartE2EDuration="2.784316421s" podCreationTimestamp="2026-04-22 14:09:20 +0000 UTC" firstStartedPulling="2026-04-22 14:09:20.828216289 +0000 UTC m=+2877.481320048" lastFinishedPulling="2026-04-22 14:09:21.700126221 +0000 UTC m=+2878.353229984" observedRunningTime="2026-04-22 14:09:22.782622348 +0000 UTC m=+2879.435726133" watchObservedRunningTime="2026-04-22 14:09:22.784316421 +0000 UTC m=+2879.437420223" Apr 22 14:09:23.126820 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:09:23.126734 2567 log.go:32] "Failed when parsing line in log file" err="unexpected timestamp format \"2006-01-02T15:04:05.999999999Z07:00\": parsing time \"2026-02026-04-22T13:57:48.579657901+00:00\" as \"2006-01-02T15:04:05.999999999Z07:00\": cannot parse \"026-04-22T13:57:48.579657901+00:00\" as \"-\"" path="/var/log/pods/kube-system_global-pull-secret-syncer-2bm2k_2e1f3eb0-8599-43eb-a51e-a087b49b8c3a/global-pull-secret-syncer/0.log" line="2026-02026-04-22T13:57:48.579657901+00:00 stderr F {\"level\":\"info\",\"timestamp\":\"2026-04-22T13:57:48Z\",\"caller\":\"sync-global-pullsecret/sync-global-pullsecret.go:164\",\"msg\":\"Syncing global pull secret\"}\n" Apr 22 14:09:23.128479 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:23.128452 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2bm2k_2e1f3eb0-8599-43eb-a51e-a087b49b8c3a/global-pull-secret-syncer/0.log" Apr 22 14:09:23.343486 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:23.343444 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xfrxg_105c8108-1f35-4d1b-8170-b9f59625a7c3/konnectivity-agent/0.log" Apr 22 14:09:23.410316 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:23.410207 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-133.ec2.internal_e2e036ce9e751626ebdb05be6b1bff59/haproxy/0.log" Apr 22 14:09:25.211268 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.211231 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jhrrk/must-gather-thtz7"] Apr 22 14:09:25.212281 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.212246 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-jhrrk/must-gather-thtz7" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="copy" containerID="cri-o://c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752" gracePeriod=2 Apr 22 14:09:25.214317 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.214289 2567 status_manager.go:895] "Failed to get status for pod" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" pod="openshift-must-gather-jhrrk/must-gather-thtz7" err="pods \"must-gather-thtz7\" is forbidden: User \"system:node:ip-10-0-142-133.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jhrrk\": no relationship found between node 'ip-10-0-142-133.ec2.internal' and this object" Apr 22 14:09:25.218019 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.217274 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jhrrk/must-gather-thtz7"] Apr 22 14:09:25.629188 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.626728 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jhrrk_must-gather-thtz7_14cd1d5f-2f87-4dde-a626-e6197565c625/copy/0.log" Apr 22 14:09:25.629188 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.627243 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrrk/must-gather-thtz7" Apr 22 14:09:25.629188 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.628893 2567 status_manager.go:895] "Failed to get status for pod" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" pod="openshift-must-gather-jhrrk/must-gather-thtz7" err="pods \"must-gather-thtz7\" is forbidden: User \"system:node:ip-10-0-142-133.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jhrrk\": no relationship found between node 'ip-10-0-142-133.ec2.internal' and this object" Apr 22 14:09:25.779382 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.779349 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jhrrk_must-gather-thtz7_14cd1d5f-2f87-4dde-a626-e6197565c625/copy/0.log" Apr 22 14:09:25.779772 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.779741 2567 generic.go:358] "Generic (PLEG): container finished" podID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerID="c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752" exitCode=143 Apr 22 14:09:25.779910 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.779806 2567 scope.go:117] "RemoveContainer" containerID="c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752" Apr 22 14:09:25.779977 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.779942 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jhrrk/must-gather-thtz7" Apr 22 14:09:25.781890 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.781846 2567 status_manager.go:895] "Failed to get status for pod" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" pod="openshift-must-gather-jhrrk/must-gather-thtz7" err="pods \"must-gather-thtz7\" is forbidden: User \"system:node:ip-10-0-142-133.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jhrrk\": no relationship found between node 'ip-10-0-142-133.ec2.internal' and this object" Apr 22 14:09:25.786775 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.786718 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14cd1d5f-2f87-4dde-a626-e6197565c625-must-gather-output\") pod \"14cd1d5f-2f87-4dde-a626-e6197565c625\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " Apr 22 14:09:25.786970 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.786933 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqzj7\" (UniqueName: \"kubernetes.io/projected/14cd1d5f-2f87-4dde-a626-e6197565c625-kube-api-access-rqzj7\") pod \"14cd1d5f-2f87-4dde-a626-e6197565c625\" (UID: \"14cd1d5f-2f87-4dde-a626-e6197565c625\") " Apr 22 14:09:25.789778 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.789740 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cd1d5f-2f87-4dde-a626-e6197565c625-kube-api-access-rqzj7" (OuterVolumeSpecName: "kube-api-access-rqzj7") pod "14cd1d5f-2f87-4dde-a626-e6197565c625" (UID: "14cd1d5f-2f87-4dde-a626-e6197565c625"). InnerVolumeSpecName "kube-api-access-rqzj7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:09:25.791887 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.791273 2567 scope.go:117] "RemoveContainer" containerID="59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c" Apr 22 14:09:25.797186 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.794291 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14cd1d5f-2f87-4dde-a626-e6197565c625-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "14cd1d5f-2f87-4dde-a626-e6197565c625" (UID: "14cd1d5f-2f87-4dde-a626-e6197565c625"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:09:25.816032 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.815970 2567 scope.go:117] "RemoveContainer" containerID="c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752" Apr 22 14:09:25.816657 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:09:25.816427 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752\": container with ID starting with c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752 not found: ID does not exist" containerID="c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752" Apr 22 14:09:25.816657 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.816463 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752"} err="failed to get container status \"c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752\": rpc error: code = NotFound desc = could not find container \"c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752\": container with ID starting with c3099ca38c27cedbda278f1a92544e96df9e80820523617a247fbf68214c4752 not 
found: ID does not exist" Apr 22 14:09:25.816657 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.816489 2567 scope.go:117] "RemoveContainer" containerID="59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c" Apr 22 14:09:25.827369 ip-10-0-142-133 kubenswrapper[2567]: E0422 14:09:25.816983 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c\": container with ID starting with 59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c not found: ID does not exist" containerID="59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c" Apr 22 14:09:25.827369 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.826859 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c"} err="failed to get container status \"59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c\": rpc error: code = NotFound desc = could not find container \"59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c\": container with ID starting with 59b69331cb55d96f94b6d07c8c64f6173f8d426651c324cee59f7e500cfa900c not found: ID does not exist" Apr 22 14:09:25.888648 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.888540 2567 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14cd1d5f-2f87-4dde-a626-e6197565c625-must-gather-output\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 14:09:25.888648 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.888579 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqzj7\" (UniqueName: \"kubernetes.io/projected/14cd1d5f-2f87-4dde-a626-e6197565c625-kube-api-access-rqzj7\") on node \"ip-10-0-142-133.ec2.internal\" DevicePath \"\"" Apr 22 14:09:25.949983 
ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:25.949949 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" path="/var/lib/kubelet/pods/14cd1d5f-2f87-4dde-a626-e6197565c625/volumes" Apr 22 14:09:26.510043 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.510013 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srp4x_61c05769-3e86-4c56-836b-6696f02722af/kube-state-metrics/0.log" Apr 22 14:09:26.533974 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.533944 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srp4x_61c05769-3e86-4c56-836b-6696f02722af/kube-rbac-proxy-main/0.log" Apr 22 14:09:26.556278 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.556244 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srp4x_61c05769-3e86-4c56-836b-6696f02722af/kube-rbac-proxy-self/0.log" Apr 22 14:09:26.604846 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.604815 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-vmnfx_d883f20c-5cff-43c8-ac4c-ca28964b1e8e/monitoring-plugin/0.log" Apr 22 14:09:26.691617 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.691587 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jm87x_e6cb5329-b117-42b2-8a20-2d8bbd3ccc40/node-exporter/0.log" Apr 22 14:09:26.714489 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.714457 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jm87x_e6cb5329-b117-42b2-8a20-2d8bbd3ccc40/kube-rbac-proxy/0.log" Apr 22 14:09:26.736420 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:26.736387 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-jm87x_e6cb5329-b117-42b2-8a20-2d8bbd3ccc40/init-textfile/0.log" Apr 22 14:09:27.071146 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.071112 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-md289_a04764b3-e96d-4a35-8c3a-14a1c7c86599/prometheus-operator/0.log" Apr 22 14:09:27.089966 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.089926 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-md289_a04764b3-e96d-4a35-8c3a-14a1c7c86599/kube-rbac-proxy/0.log" Apr 22 14:09:27.215627 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.215596 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-86696546f4-v6h8j_8e7f151a-ce1d-4348-8440-269a9010bb2b/thanos-query/0.log" Apr 22 14:09:27.238485 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.238457 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-86696546f4-v6h8j_8e7f151a-ce1d-4348-8440-269a9010bb2b/kube-rbac-proxy-web/0.log" Apr 22 14:09:27.277977 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.277944 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-86696546f4-v6h8j_8e7f151a-ce1d-4348-8440-269a9010bb2b/kube-rbac-proxy/0.log" Apr 22 14:09:27.297800 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.297766 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-86696546f4-v6h8j_8e7f151a-ce1d-4348-8440-269a9010bb2b/prom-label-proxy/0.log" Apr 22 14:09:27.320832 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:27.320800 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-86696546f4-v6h8j_8e7f151a-ce1d-4348-8440-269a9010bb2b/kube-rbac-proxy-rules/0.log" Apr 22 14:09:27.339483 ip-10-0-142-133 
kubenswrapper[2567]: I0422 14:09:27.339407 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-86696546f4-v6h8j_8e7f151a-ce1d-4348-8440-269a9010bb2b/kube-rbac-proxy-metrics/0.log" Apr 22 14:09:30.337059 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.337031 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gh69k_352d4472-2ff1-4835-b7bb-78277c591127/dns/0.log" Apr 22 14:09:30.346161 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346129 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"] Apr 22 14:09:30.346440 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346427 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="copy" Apr 22 14:09:30.346492 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346442 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="copy" Apr 22 14:09:30.346492 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346458 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="gather" Apr 22 14:09:30.346492 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346464 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="gather" Apr 22 14:09:30.346599 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346515 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="copy" Apr 22 14:09:30.346599 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.346526 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="14cd1d5f-2f87-4dde-a626-e6197565c625" containerName="gather" Apr 22 14:09:30.350957 ip-10-0-142-133 kubenswrapper[2567]: 
I0422 14:09:30.350822 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" Apr 22 14:09:30.359261 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.359236 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"] Apr 22 14:09:30.369273 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.369253 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gh69k_352d4472-2ff1-4835-b7bb-78277c591127/kube-rbac-proxy/0.log" Apr 22 14:09:30.426400 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.426368 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-sys\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" Apr 22 14:09:30.426548 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.426418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-lib-modules\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" Apr 22 14:09:30.426548 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.426482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbhd2\" (UniqueName: \"kubernetes.io/projected/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-kube-api-access-jbhd2\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" Apr 22 14:09:30.426548 ip-10-0-142-133 
kubenswrapper[2567]: I0422 14:09:30.426540 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-proc\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.426673 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.426572 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-podres\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.517189 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.517133 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pjr6r_1f59a955-0700-417a-a991-8b127c7e9438/dns-node-resolver/0.log"
Apr 22 14:09:30.527513 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-proc\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527644 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527516 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-podres\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527644 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-sys\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527644 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-lib-modules\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527644 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527612 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-proc\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527778 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbhd2\" (UniqueName: \"kubernetes.io/projected/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-kube-api-access-jbhd2\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527778 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527670 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-sys\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527778 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-podres\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.527870 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.527775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-lib-modules\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.537368 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.535806 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbhd2\" (UniqueName: \"kubernetes.io/projected/ca364ac1-4f4b-40a6-9333-ed2d9790b88c-kube-api-access-jbhd2\") pod \"perf-node-gather-daemonset-jkj4v\" (UID: \"ca364ac1-4f4b-40a6-9333-ed2d9790b88c\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.662379 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.662303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:30.804342 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.804315 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"]
Apr 22 14:09:30.955607 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:30.955585 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jgtzf_8c3da84b-ac7e-4b33-9e51-bda762f06468/node-ca/0.log"
Apr 22 14:09:31.805421 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:31.805383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" event={"ID":"ca364ac1-4f4b-40a6-9333-ed2d9790b88c","Type":"ContainerStarted","Data":"2595e195ee86829b92c27f003ad8b00dceab483f9a6a2c03c75fe3b42618632d"}
Apr 22 14:09:31.805998 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:31.805957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" event={"ID":"ca364ac1-4f4b-40a6-9333-ed2d9790b88c","Type":"ContainerStarted","Data":"c8e1d200fbb503406f00868230c457f9841f75bbd381f028728b26893b99fc16"}
Apr 22 14:09:31.806421 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:31.806388 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:31.822248 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:31.822194 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v" podStartSLOduration=1.8221788559999998 podStartE2EDuration="1.822178856s" podCreationTimestamp="2026-04-22 14:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:09:31.820050473 +0000 UTC m=+2888.473154255" watchObservedRunningTime="2026-04-22 14:09:31.822178856 +0000 UTC m=+2888.475282630"
Apr 22 14:09:31.937295 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:31.937240 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bh9wx_15464d00-15bf-4f8f-a47f-ffeac13f32c5/serve-healthcheck-canary/0.log"
Apr 22 14:09:32.357947 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:32.357917 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9r9bf_e1257ae3-f69b-4b5c-b3ff-2400607495ed/kube-rbac-proxy/0.log"
Apr 22 14:09:32.376716 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:32.376695 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9r9bf_e1257ae3-f69b-4b5c-b3ff-2400607495ed/exporter/0.log"
Apr 22 14:09:32.395537 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:32.395511 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9r9bf_e1257ae3-f69b-4b5c-b3ff-2400607495ed/extractor/0.log"
Apr 22 14:09:34.025609 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:34.025574 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-4bdjk_2e2ee773-5689-44fb-8fb0-cfdb5d688559/jobset-operator/0.log"
Apr 22 14:09:37.822081 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:37.822051 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-jkj4v"
Apr 22 14:09:38.122759 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.122681 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/kube-multus-additional-cni-plugins/0.log"
Apr 22 14:09:38.143302 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.143271 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/egress-router-binary-copy/0.log"
Apr 22 14:09:38.161592 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.161556 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/cni-plugins/0.log"
Apr 22 14:09:38.180085 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.180057 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/bond-cni-plugin/0.log"
Apr 22 14:09:38.198738 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.198710 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/routeoverride-cni/0.log"
Apr 22 14:09:38.217310 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.217279 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/whereabouts-cni-bincopy/0.log"
Apr 22 14:09:38.236560 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.236528 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-97vcn_52ffca56-a3d8-49c4-b537-b0fc46ac5d2c/whereabouts-cni/0.log"
Apr 22 14:09:38.454671 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.454593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k8gsc_220972eb-8140-47aa-bef6-2fc6a45d677d/kube-multus/0.log"
Apr 22 14:09:38.559359 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.559331 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbkn6_1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8/network-metrics-daemon/0.log"
Apr 22 14:09:38.580213 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:38.580178 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbkn6_1ecfd1aa-fe6d-4a7e-a343-9807fd0d3bc8/kube-rbac-proxy/0.log"
Apr 22 14:09:39.986191 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:39.986116 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-controller/0.log"
Apr 22 14:09:40.001513 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.001481 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/0.log"
Apr 22 14:09:40.027761 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.027738 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovn-acl-logging/1.log"
Apr 22 14:09:40.044544 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.044517 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/kube-rbac-proxy-node/0.log"
Apr 22 14:09:40.064136 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.064103 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 14:09:40.081657 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.081593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/northd/0.log"
Apr 22 14:09:40.099053 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.099029 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/nbdb/0.log"
Apr 22 14:09:40.117013 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.116986 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/sbdb/0.log"
Apr 22 14:09:40.305449 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:40.305412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n7z2m_474b725c-806b-45d7-b14a-c4e4d0f026cd/ovnkube-controller/0.log"
Apr 22 14:09:41.299889 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:41.299858 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jklhk_2887fb17-5e94-487b-9353-de32227a0d91/network-check-target-container/0.log"
Apr 22 14:09:42.140670 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:42.140637 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-59t8w_1d87803a-e452-4d88-ab3e-466482b69647/iptables-alerter/0.log"
Apr 22 14:09:42.741269 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:42.741235 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nxh6k_225bdcdd-db73-48cb-b288-6a1d2423f8df/tuned/0.log"
Apr 22 14:09:46.249391 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:46.249359 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-ms86p_5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4/csi-driver/0.log"
Apr 22 14:09:46.274155 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:46.274123 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-ms86p_5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4/csi-node-driver-registrar/0.log"
Apr 22 14:09:46.297805 ip-10-0-142-133 kubenswrapper[2567]: I0422 14:09:46.297779 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-ms86p_5aad924b-2ce8-4efb-a7b1-7b9e7e09e7f4/csi-liveness-probe/0.log"