Apr 21 07:02:54.850793 ip-10-0-143-69 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:02:55.274655 ip-10-0-143-69 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:02:55.274655 ip-10-0-143-69 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:02:55.274655 ip-10-0-143-69 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:02:55.274655 ip-10-0-143-69 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:02:55.274655 ip-10-0-143-69 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
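The deprecation warnings above all point at the same remedy: move the flag values into the file passed via the kubelet's --config flag. A minimal sketch of what that could look like, assuming the kubelet.config.k8s.io/v1beta1 API; the concrete values (plugin directory, reservations, eviction thresholds) are illustrative placeholders, not values taken from this node:

```yaml
# Hypothetical KubeletConfiguration fragment replacing the deprecated flags
# logged above. Field names follow kubelet.config.k8s.io/v1beta1; all values
# here are example assumptions except the CRI-O socket, which appears later
# in this log as --container-runtime-endpoint.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir (example path)
systemReserved:            # replaces --system-reserved (example reservations)
  cpu: 500m
  memory: 1Gi
evictionHard:              # replaces --minimum-container-ttl-duration per the warning
  memory.available: 100Mi
```

--pod-infra-container-image has no config-file equivalent here; per the warning, the sandbox image should instead be configured in the container runtime, since the kubelet's image garbage collector now gets it from CRI.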
Apr 21 07:02:55.275799 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.275245 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:02:55.282424 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282405 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:02:55.282424 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282423 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:02:55.282424 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282428 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282431 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282434 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282437 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282440 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282443 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282445 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282448 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282451 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282454 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282457 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282459 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282462 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282465 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282467 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282470 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282473 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282475 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282480 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:02:55.282562 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282484 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282487 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282489 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282492 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282495 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282497 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282500 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282502 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282521 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282530 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282534 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282538 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282541 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282544 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282547 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282550 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282553 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282556 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282558 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282561 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:02:55.283012 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282564 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282567 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282569 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282573 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282578 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282580 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282583 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282586 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282589 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282592 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282615 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282618 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282622 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282625 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282628 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282631 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282633 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282636 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282639 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282641 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:02:55.283551 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282644 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282646 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282649 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282652 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282654 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282657 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282660 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282664 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282667 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282669 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282672 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282675 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282678 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282680 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282683 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282685 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282689 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282692 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282695 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282698 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:02:55.284037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282700 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282703 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282705 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282708 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.282710 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283110 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283116 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283119 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283122 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283125 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283128 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283130 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283133 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283136 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283139 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283141 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283144 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283147 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283150 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283152 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:02:55.284544 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283155 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283157 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283160 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283163 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283165 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283168 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283170 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283175 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283178 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283182 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283185 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283188 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283191 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283194 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283196 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283199 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283201 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283204 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283206 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:02:55.285055 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283209 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283212 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283214 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283217 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283219 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283222 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283224 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283227 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283230 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283232 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283235 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283237 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283240 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283243 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283246 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283248 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283250 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283253 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283255 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283258 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:02:55.285545 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283261 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283263 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283265 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283269 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283271 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283273 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283276 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283279 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283281 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283283 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283286 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283288 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283291 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283293 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283296 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283299 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283301 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283304 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283308 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283311 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:02:55.286080 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283313 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283316 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283318 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283321 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283323 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283326 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283328 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283331 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283333 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283336 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283338 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.283341 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283408 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283414 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283421 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283425 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283434 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283438 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283443 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283447 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283450 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 07:02:55.286591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283453 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283457 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283460 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283463 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283466 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283469 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283472 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283477 2573 flags.go:64] FLAG: --cloud-config=""
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283480 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283482 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283486 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283489 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283492 2573 flags.go:64] FLAG: --config-dir=""
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283495 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283499 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283503 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283519 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283523 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283526 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283529 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283532 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283535 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283538 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283541 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283545 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 07:02:55.287097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283548 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283552 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283555 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283560 2573 flags.go:64] FLAG: --enable-server="true"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283563 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283567 2573 flags.go:64] FLAG: --event-burst="100"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283571 2573 flags.go:64] FLAG: --event-qps="50"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283574 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283577 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283580 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283583 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283586 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283589 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283592 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283596 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283599 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283602 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283606 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283609 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283612 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283615 2573 flags.go:64] FLAG: --feature-gates=""
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283619 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283622 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]:
I0421 07:02:55.283625 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283628 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283631 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 21 07:02:55.287719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283634 2573 flags.go:64] FLAG: --help="false" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283637 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283640 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283643 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283645 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283649 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283652 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283655 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283658 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283660 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283664 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:02:55.288346 
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283666 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283669 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283672 2573 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283676 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283679 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283682 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283685 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283688 2573 flags.go:64] FLAG: --lock-file="" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283691 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283695 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283698 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283703 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:02:55.288346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283706 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283709 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283712 2573 flags.go:64] FLAG: 
--logging-format="text" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283715 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283718 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283721 2573 flags.go:64] FLAG: --manifest-url="" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283724 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283728 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283731 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283735 2573 flags.go:64] FLAG: --max-pods="110" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283738 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283741 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283744 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283747 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283750 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283753 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283756 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 
07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283763 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283766 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283769 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283772 2573 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283775 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283780 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283783 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:02:55.288934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283788 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283791 2573 flags.go:64] FLAG: --port="10250" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283795 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283797 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04b2c460e18fc1752" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283800 2573 flags.go:64] FLAG: --qos-reserved="" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283804 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283807 2573 flags.go:64] FLAG: --register-node="true" Apr 21 07:02:55.289588 ip-10-0-143-69 
kubenswrapper[2573]: I0421 07:02:55.283810 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283813 2573 flags.go:64] FLAG: --register-with-taints="" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283817 2573 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283819 2573 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283822 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283825 2573 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283829 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283832 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283835 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283837 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283840 2573 flags.go:64] FLAG: --runonce="false" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283843 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283846 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283849 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283852 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:02:55.289588 ip-10-0-143-69 
kubenswrapper[2573]: I0421 07:02:55.283854 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283857 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283860 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283863 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:02:55.289588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283866 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283869 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283872 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283875 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283878 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283881 2573 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283886 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283891 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283894 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283897 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283900 2573 flags.go:64] FLAG: 
--tls-min-version="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283905 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283908 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283911 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283914 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283917 2573 flags.go:64] FLAG: --v="2" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283920 2573 flags.go:64] FLAG: --version="false" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283924 2573 flags.go:64] FLAG: --vmodule="" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283929 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.283932 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284016 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284019 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284023 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284026 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:02:55.290205 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284029 2573 feature_gate.go:328] unrecognized 
feature gate: AdditionalRoutingCapabilities Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284032 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284035 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284038 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284041 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284043 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284047 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284049 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284052 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284055 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284058 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284061 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284064 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284066 2573 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284070 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284073 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284075 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284078 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284081 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284084 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:02:55.290800 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284087 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284089 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284092 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284095 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284097 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284100 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:02:55.291302 
ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284103 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284105 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284108 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284110 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284113 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284115 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284118 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284120 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284123 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284125 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284128 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284130 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284133 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 
07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284135 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:02:55.291302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284138 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284140 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284143 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284145 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284148 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284151 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284155 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284157 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284160 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284162 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284165 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284168 2573 feature_gate.go:328] unrecognized feature 
gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284171 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284173 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284176 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284178 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284181 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284184 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284186 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:02:55.291819 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284188 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284191 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284194 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284196 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284199 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:02:55.292285 
ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284201 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284204 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284207 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284212 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284215 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284219 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284222 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284225 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284228 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284231 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284234 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284237 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 
07:02:55.284239 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284242 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:02:55.292285 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284247 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284250 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284253 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.284256 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.284814 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.290426 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.290442 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290489 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 
07:02:55.290493 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290497 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290500 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290503 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290526 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290531 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290535 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290538 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:02:55.292768 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290541 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290544 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290546 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290549 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290552 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 
07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290555 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290557 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290560 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290563 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290565 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290569 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290571 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290574 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290576 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290579 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290581 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290584 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290586 2573 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290589 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290592 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:02:55.293169 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290594 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290596 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290601 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290605 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290609 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290612 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290615 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290618 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290621 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290624 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290626 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290629 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290632 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290634 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290637 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290640 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290642 2573 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290645 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290647 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:02:55.293693 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290650 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290653 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290655 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290657 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290660 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290662 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290665 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290669 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290672 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290675 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290678 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290680 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290683 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290685 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290688 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290692 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290695 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290697 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290700 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:02:55.294167 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290702 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290705 2573 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290707 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290710 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290712 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290715 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290718 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290720 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290723 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290725 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290728 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290730 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290733 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290735 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:02:55.294646 ip-10-0-143-69 
kubenswrapper[2573]: W0421 07:02:55.290738 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290740 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290743 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290745 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:02:55.294646 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290747 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.290753 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290846 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290851 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290854 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290856 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:02:55.295133 
ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290859 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290862 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290864 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290867 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290869 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290872 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290875 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290878 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290880 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290883 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:02:55.295133 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290886 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290888 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290891 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 
07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290893 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290896 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290898 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290901 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290903 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290905 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290908 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290911 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290913 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290916 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290918 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290921 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290923 2573 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290925 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290928 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290930 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290933 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:02:55.295683 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290935 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290938 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290940 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290942 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290945 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290947 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290949 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290952 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:02:55.296161 ip-10-0-143-69 
kubenswrapper[2573]: W0421 07:02:55.290954 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290957 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290960 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290963 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290965 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290967 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290971 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290973 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290976 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290978 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290980 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290983 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:02:55.296161 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290985 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:02:55.296665 ip-10-0-143-69 
kubenswrapper[2573]: W0421 07:02:55.290988 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290990 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290993 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290995 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.290998 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291000 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291003 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291005 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291008 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291010 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291013 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291015 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291018 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:02:55.296665 
ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291021 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291025 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291028 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291031 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291034 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291037 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:02:55.296665 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291039 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291042 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291046 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291050 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291052 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291055 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291058 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291061 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291063 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291066 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291068 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:55.291071 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.291075 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.291709 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 07:02:55.297144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.295372 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 07:02:55.297496 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.296338 2573 server.go:1019] "Starting client certificate rotation" Apr 21 07:02:55.297496 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.296433 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 07:02:55.297496 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.296474 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 07:02:55.318945 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.318928 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 07:02:55.321721 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.321699 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 07:02:55.335449 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.335426 2573 log.go:25] "Validated CRI v1 runtime API" Apr 21 07:02:55.341300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.341284 2573 log.go:25] "Validated CRI v1 image API" Apr 21 07:02:55.343273 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.343253 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 07:02:55.348802 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.348778 2573 fs.go:135] Filesystem UUIDs: 
map[170f521f-8ee9-4a79-bdbd-22664c2baa80:/dev/nvme0n1p3 719a271e-c63e-4dec-b4f9-4e903a168b07:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 21 07:02:55.348876 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.348800 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 07:02:55.352869 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.352852 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 07:02:55.354438 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.354333 2573 manager.go:217] Machine: {Timestamp:2026-04-21 07:02:55.35239269 +0000 UTC m=+0.384443908 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3084608 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2760750a51732efea69558d2b8909a SystemUUID:ec276075-0a51-732e-fea6-9558d2b8909a BootID:3b046ef3-e1de-4b58-a472-3c4ef2bf2ce8 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 
HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8e:68:bc:72:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8e:68:bc:72:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:c0:ad:98:3c:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 07:02:55.354438 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.354434 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 07:02:55.354565 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.354504 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 07:02:55.355677 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.355658 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 07:02:55.355822 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.355678 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-69.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 07:02:55.355872 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.355832 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 07:02:55.355872 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.355841 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 07:02:55.355872 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.355854 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:02:55.356440 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.356431 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:02:55.357645 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.357635 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:02:55.357745 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.357736 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 07:02:55.359646 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.359636 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:02:55.359703 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.359651 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:02:55.359703 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.359663 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:02:55.359703 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.359672 2573 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:02:55.359703 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.359680 2573 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 21 07:02:55.360785 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.360771 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:02:55.360831 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.360798 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:02:55.363393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.363377 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:02:55.365027 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.365011 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:02:55.366156 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366135 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366165 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366175 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366185 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366202 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366213 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366226 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 
07:02:55.366240 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366237 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:02:55.366504 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366247 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:02:55.366504 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366256 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:02:55.366504 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366281 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:02:55.366504 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.366295 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 07:02:55.367159 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.367147 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:02:55.367211 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.367162 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:02:55.370386 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.370371 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:02:55.370463 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.370411 2573 server.go:1295] "Started kubelet" Apr 21 07:02:55.370572 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.370498 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:02:55.370646 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.370560 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:02:55.370693 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.370682 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:02:55.371240 ip-10-0-143-69 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 07:02:55.375458 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.372315 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:02:55.376330 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.376307 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-69.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 07:02:55.376534 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.376458 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 07:02:55.376773 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.376749 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-69.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 07:02:55.378827 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.378811 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:02:55.380630 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.380611 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:02:55.381235 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.381065 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:02:55.381725 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.379318 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-69.ec2.internal.18a84d3c87dc11ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-69.ec2.internal,UID:ip-10-0-143-69.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-69.ec2.internal,},FirstTimestamp:2026-04-21 07:02:55.370383855 +0000 UTC m=+0.402435078,LastTimestamp:2026-04-21 07:02:55.370383855 +0000 UTC m=+0.402435078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-69.ec2.internal,}" Apr 21 07:02:55.382145 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382118 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 07:02:55.382145 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382145 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:02:55.382288 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382214 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:02:55.382335 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382294 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 21 07:02:55.382335 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382323 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:02:55.382460 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.382442 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 07:02:55.382560 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382539 2573 factory.go:55] Registering systemd factory Apr 21 07:02:55.382719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382565 2573 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:02:55.382833 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382822 2573 factory.go:153] Registering CRI-O factory Apr 21 07:02:55.382909 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382838 2573 factory.go:223] Registration of the crio container factory successfully Apr 21 07:02:55.382909 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382891 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:02:55.383003 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382915 2573 factory.go:103] Registering Raw factory Apr 21 07:02:55.383003 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.382935 2573 manager.go:1196] Started watching for new ooms in manager Apr 21 07:02:55.383422 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.383403 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.383871 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.383845 2573 manager.go:319] Starting recovery of all containers Apr 21 07:02:55.391725 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.391696 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 07:02:55.391791 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.391726 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-69.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 07:02:55.393845 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.393831 2573 manager.go:324] Recovery completed Apr 21 07:02:55.397583 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.397571 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:02:55.399890 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.399866 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:02:55.399951 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.399905 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:02:55.399951 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.399916 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:02:55.400377 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.400363 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:02:55.400377 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.400376 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 07:02:55.400480 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.400395 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:02:55.400676 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.400661 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kbzrz" Apr 21 07:02:55.401943 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.401883 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-69.ec2.internal.18a84d3c899e4bb3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-69.ec2.internal,UID:ip-10-0-143-69.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-69.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-69.ec2.internal,},FirstTimestamp:2026-04-21 07:02:55.399889843 +0000 UTC m=+0.431941063,LastTimestamp:2026-04-21 07:02:55.399889843 +0000 UTC m=+0.431941063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-69.ec2.internal,}" Apr 21 07:02:55.402977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.402958 2573 policy_none.go:49] "None policy: Start" Apr 21 07:02:55.402977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.402976 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:02:55.403089 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.402990 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 21 07:02:55.407981 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.407959 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kbzrz" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439080 2573 manager.go:341] "Starting Device Plugin manager" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.439121 2573 manager.go:517] "Failed to 
read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439132 2573 server.go:85] "Starting device plugin registration server" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439350 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439360 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439453 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439541 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.439550 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.440186 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 07:02:55.451220 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.440255 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.529670 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.529600 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 07:02:55.530787 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.530769 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 21 07:02:55.530897 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.530795 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:02:55.530897 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.530816 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 07:02:55.530897 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.530823 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:02:55.530897 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.530852 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:02:55.533741 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.533717 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:02:55.539919 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.539900 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:02:55.540788 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.540773 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:02:55.540871 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.540801 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:02:55.540871 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.540815 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:02:55.540871 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.540838 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.547270 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.547256 2573 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.547329 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.547276 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-69.ec2.internal\": node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.584044 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.584018 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.631744 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.631719 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal"] Apr 21 07:02:55.631838 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.631789 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:02:55.632722 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.632706 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:02:55.632804 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.632732 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:02:55.632804 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.632743 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:02:55.634916 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.634904 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:02:55.635057 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.635045 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.635103 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.635071 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:02:55.635699 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.635685 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:02:55.635761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.635713 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:02:55.635761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.635723 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:02:55.636083 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.636070 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:02:55.636145 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.636094 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:02:55.636145 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.636107 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:02:55.638633 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.638619 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.638693 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.638643 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:02:55.639667 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.639641 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:02:55.639758 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.639671 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:02:55.639758 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.639685 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:02:55.658070 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.658048 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-69.ec2.internal\" not found" node="ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.662376 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.662361 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-69.ec2.internal\" not found" node="ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.683837 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.683816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/192bfeaa4c26d06d04fe2b9437ecbb37-config\") pod \"kube-apiserver-proxy-ip-10-0-143-69.ec2.internal\" (UID: \"192bfeaa4c26d06d04fe2b9437ecbb37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.683903 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.683847 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bce90b98ae5ab75e21e2fc6e3c6353e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal\" (UID: \"bce90b98ae5ab75e21e2fc6e3c6353e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.683903 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.683864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bce90b98ae5ab75e21e2fc6e3c6353e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal\" (UID: \"bce90b98ae5ab75e21e2fc6e3c6353e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.684863 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.684849 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.784710 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.784645 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/192bfeaa4c26d06d04fe2b9437ecbb37-config\") pod \"kube-apiserver-proxy-ip-10-0-143-69.ec2.internal\" (UID: \"192bfeaa4c26d06d04fe2b9437ecbb37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.784710 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.784685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/192bfeaa4c26d06d04fe2b9437ecbb37-config\") pod \"kube-apiserver-proxy-ip-10-0-143-69.ec2.internal\" (UID: \"192bfeaa4c26d06d04fe2b9437ecbb37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.784881 ip-10-0-143-69 kubenswrapper[2573]: I0421 
07:02:55.784739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bce90b98ae5ab75e21e2fc6e3c6353e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal\" (UID: \"bce90b98ae5ab75e21e2fc6e3c6353e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.784881 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.784766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bce90b98ae5ab75e21e2fc6e3c6353e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal\" (UID: \"bce90b98ae5ab75e21e2fc6e3c6353e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.784881 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.784797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bce90b98ae5ab75e21e2fc6e3c6353e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal\" (UID: \"bce90b98ae5ab75e21e2fc6e3c6353e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.784881 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.784818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bce90b98ae5ab75e21e2fc6e3c6353e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal\" (UID: \"bce90b98ae5ab75e21e2fc6e3c6353e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.785669 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.785653 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.886414 
ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.886387 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:55.960622 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.960589 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.965167 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:55.965145 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" Apr 21 07:02:55.987015 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:55.986991 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:56.087569 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.087488 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:56.187915 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.187891 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:56.267125 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.267100 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:02:56.278714 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.278688 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:02:56.288065 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.288044 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:56.296198 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.296182 2573 
transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 07:02:56.296299 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.296283 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:02:56.296352 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.296333 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:02:56.296389 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.296333 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:02:56.381391 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.381371 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 07:02:56.388219 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.388196 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found" Apr 21 07:02:56.394468 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.394450 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 07:02:56.409569 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.409537 2573 certificate_manager.go:715] "Certificate rotation deadline 
determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 06:57:55 +0000 UTC" deadline="2027-10-09 04:20:15.623244595 +0000 UTC"
Apr 21 07:02:56.409669 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.409570 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12861h17m19.213678684s"
Apr 21 07:02:56.426135 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.426116 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cr7kg"
Apr 21 07:02:56.432844 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.432824 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cr7kg"
Apr 21 07:02:56.483432 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:56.483405 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod192bfeaa4c26d06d04fe2b9437ecbb37.slice/crio-648c73c77a76d53a7f05e9355ac20690eaf0b75ee61f732d560592862e29583f WatchSource:0}: Error finding container 648c73c77a76d53a7f05e9355ac20690eaf0b75ee61f732d560592862e29583f: Status 404 returned error can't find the container with id 648c73c77a76d53a7f05e9355ac20690eaf0b75ee61f732d560592862e29583f
Apr 21 07:02:56.487997 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.487983 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 07:02:56.488261 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.488240 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found"
Apr 21 07:02:56.534175 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.534134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal"
event={"ID":"192bfeaa4c26d06d04fe2b9437ecbb37","Type":"ContainerStarted","Data":"648c73c77a76d53a7f05e9355ac20690eaf0b75ee61f732d560592862e29583f"}
Apr 21 07:02:56.588567 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.588544 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found"
Apr 21 07:02:56.643048 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:56.643022 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbce90b98ae5ab75e21e2fc6e3c6353e7.slice/crio-cfb441f2b62e2a350e13816a283dfd63319631ec68d0d01a5b197d6b8979a8fe WatchSource:0}: Error finding container cfb441f2b62e2a350e13816a283dfd63319631ec68d0d01a5b197d6b8979a8fe: Status 404 returned error can't find the container with id cfb441f2b62e2a350e13816a283dfd63319631ec68d0d01a5b197d6b8979a8fe
Apr 21 07:02:56.688849 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:56.688826 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-69.ec2.internal\" not found"
Apr 21 07:02:56.715391 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.715367 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:02:56.782433 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.782409 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal"
Apr 21 07:02:56.795497 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.795476 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 07:02:56.796719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.796706 2573 kubelet.go:3340] "Creating a mirror pod for static pod"
pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal"
Apr 21 07:02:56.809634 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:56.809617 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 07:02:57.147522 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.147476 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:02:57.360661 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.360555 2573 apiserver.go:52] "Watching apiserver"
Apr 21 07:02:57.368802 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.368775 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 07:02:57.371152 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.371117 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l","openshift-cluster-node-tuning-operator/tuned-xxdx2","openshift-dns/node-resolver-q7rfs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal","openshift-multus/multus-additional-cni-plugins-rlt5l","openshift-network-operator/iptables-alerter-48872","kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal","openshift-image-registry/node-ca-zzsbr","openshift-multus/multus-6krhs","openshift-multus/network-metrics-daemon-qzgxt","openshift-network-diagnostics/network-check-target-qv6dn","openshift-ovn-kubernetes/ovnkube-node-xt7hd","kube-system/konnectivity-agent-xn7rw"]
Apr 21 07:02:57.373796 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.373762 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.375872 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.375851 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.377733 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.377713 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 07:02:57.377827 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.377755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 07:02:57.378034 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378016 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 07:02:57.378123 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378016 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 07:02:57.378190 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378175 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 07:02:57.378363 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378348 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rlf59\""
Apr 21 07:02:57.378490 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378477 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 07:02:57.378958 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378939 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:02:57.379041 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.379001 2573 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x786t\""
Apr 21 07:02:57.379118 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.378948 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 07:02:57.380425 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.380407 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q7rfs"
Apr 21 07:02:57.382843 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.382667 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 07:02:57.382843 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.382724 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 07:02:57.382843 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.382822 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.383236 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.383204 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-p2lfm\""
Apr 21 07:02:57.385244 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.385218 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.385333 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.385312 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-48872"
Apr 21 07:02:57.386285 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.386259 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 07:02:57.386435 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.386290 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 07:02:57.386698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.386652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 07:02:57.386983 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.386869 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 07:02:57.386983 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.386939 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-66w7g\""
Apr 21 07:02:57.387136 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.386996 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 07:02:57.387430 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.387310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 07:02:57.387738 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.387718 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 07:02:57.388028 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.388012 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-xn7rw"
Apr 21 07:02:57.389046 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.388842 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 07:02:57.389046 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.389031 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2zr5j\""
Apr 21 07:02:57.389218 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.389184 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 07:02:57.389382 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.389364 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-st4qj\""
Apr 21 07:02:57.391184 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.391161 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:02:57.392316 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.392207 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 07:02:57.392316 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.392241 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rgdw7\""
Apr 21 07:02:57.392868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.392841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-registration-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID:
\"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.392970 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.392899 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cnibin\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.392970 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.392949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-var-lib-kubelet\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.393076 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.392974 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2tgr\" (UniqueName: \"kubernetes.io/projected/f64f9328-2e8e-457d-ab14-8b16c32be65a-kube-api-access-t2tgr\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.393076 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393021 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-slash\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.393076 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.393263 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-host-slash\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872"
Apr 21 07:02:57.393263 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-kubelet\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.393263 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.393263 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-tmp-dir\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs"
Apr 21 07:02:57.393739 ip-10-0-143-69
kubenswrapper[2573]: I0421 07:02:57.393472 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 07:02:57.393739 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-device-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.393739 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-sys-fs\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.393739 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393684 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k29n\" (UniqueName: \"kubernetes.io/projected/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-kube-api-access-2k29n\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.393974 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-iptables-alerter-script\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872"
Apr 21 07:02:57.393974 ip-10-0-143-69
kubenswrapper[2573]: I0421 07:02:57.393845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovn-node-metrics-cert\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.393974 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.393907 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 07:02:57.394428 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.394428 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cni-binary-copy\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.394428 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394277 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-cni-bin\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394428 ip-10-0-143-69 kubenswrapper[2573]: I0421
07:02:57.394324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-systemd-units\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394428 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-log-socket\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394428 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-cni-netd\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-lib-modules\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dtbh\" (UniqueName: \"kubernetes.io/projected/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-kube-api-access-6dtbh\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394563 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-ovn\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-var-lib-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-os-release\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-tuned\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394684 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName:
\"kubernetes.io/empty-dir/f64f9328-2e8e-457d-ab14-8b16c32be65a-tmp\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw6ml\" (UniqueName: \"kubernetes.io/projected/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-kube-api-access-jw6ml\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.394761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-run\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysconfig\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysctl-d\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394838 2573 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-sys\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-socket-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.394956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovnkube-config\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-system-cni-dir\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") "
pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.395163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ws26\" (UniqueName: \"kubernetes.io/projected/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-kube-api-access-5ws26\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395225 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-systemd\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-run-ovn-kubernetes\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2l2n\" (UniqueName: \"kubernetes.io/projected/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-kube-api-access-t2l2n\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\"
(UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-run-netns\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-etc-selinux\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-systemd\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.395590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-etc-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:02:57.395590 ip-10-0-143-69
kubenswrapper[2573]: I0421 07:02:57.395556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-modprobe-d\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-env-overrides\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-kubernetes\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-hosts-file\") pod 
\"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-node-log\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovnkube-script-lib\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.395987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysctl-conf\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.396301 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.395990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-host\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.396301 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.396023 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.396301 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.396190 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.398290 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.398269 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 07:02:57.398370 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.398279 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 07:02:57.398876 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.398665 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 07:02:57.398876 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.398755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 07:02:57.398876 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.398765 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-b6xpz\"" Apr 21 07:02:57.398876 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.398803 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-57kdv\"" Apr 21 07:02:57.400149 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.400131 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:57.400701 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.400605 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:02:57.402443 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.402426 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:02:57.402547 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.402479 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:02:57.433461 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.433409 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:57:56 +0000 UTC" deadline="2027-12-27 01:07:43.300993139 +0000 UTC" Apr 21 07:02:57.433461 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.433439 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14754h4m45.8675577s" Apr 21 07:02:57.483610 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.483585 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 07:02:57.496546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2l2n\" (UniqueName: \"kubernetes.io/projected/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-kube-api-access-t2l2n\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.496697 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.496697 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496612 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkht\" (UniqueName: \"kubernetes.io/projected/e4d3a3ee-1584-42b6-a403-4bb39d451cab-kube-api-access-wwkht\") pod \"network-metrics-daemon-qzgxt\" (UID: 
\"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:57.496697 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496643 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-kubelet\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.496697 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496673 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-conf-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-run-netns\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-etc-selinux\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-run-netns\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-systemd\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496810 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-etc-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496821 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-systemd\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496840 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-etc-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-modprobe-d\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-etc-selinux\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496890 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.496898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-env-overrides\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496926 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-kubernetes\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-modprobe-d\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.496970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-socket-dir-parent\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-cni-bin\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-kubernetes\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: 
I0421 07:02:57.497045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-hosts-file\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-node-log\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497086 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-hosts-file\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovnkube-script-lib\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysctl-conf\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497184 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-host\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497131 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-node-log\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-registration-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cnibin\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-var-lib-kubelet\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 
07:02:57.497343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2tgr\" (UniqueName: \"kubernetes.io/projected/f64f9328-2e8e-457d-ab14-8b16c32be65a-kube-api-access-t2tgr\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.497469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-host\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysctl-conf\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-env-overrides\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cnibin\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497293 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-registration-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85559148-4ea4-4bfd-8bf0-55be583da361-host\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-var-lib-kubelet\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497551 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " 
pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-k8s-cni-cncf-io\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovnkube-script-lib\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-multus-certs\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-slash\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-host-slash\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-slash\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-host-slash\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.498376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-kubelet\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-os-release\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-kubelet\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.497975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-etc-kubernetes\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-tmp-dir\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-device-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-sys-fs\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k29n\" (UniqueName: \"kubernetes.io/projected/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-kube-api-access-2k29n\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-iptables-alerter-script\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498139 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovn-node-metrics-cert\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-sys-fs\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-device-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-tmp-dir\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.499200 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-system-cni-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cni-binary-copy\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-cni-bin\") pod \"ovnkube-node-xt7hd\" (UID: 
\"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-cnibin\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-systemd-units\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498586 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-log-socket\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-cni-netd\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-lib-modules\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/85559148-4ea4-4bfd-8bf0-55be583da361-serviceca\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/23dada4b-3bff-4763-9499-d08a34391b70-agent-certs\") pod \"konnectivity-agent-xn7rw\" (UID: \"23dada4b-3bff-4763-9499-d08a34391b70\") " pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-iptables-alerter-script\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtbh\" (UniqueName: \"kubernetes.io/projected/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-kube-api-access-6dtbh\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-ovn\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498896 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-var-lib-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500032 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-os-release\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-tuned\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-log-socket\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.498983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f64f9328-2e8e-457d-ab14-8b16c32be65a-tmp\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499007 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-cni-bin\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-cni-multus\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-systemd-units\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6ml\" (UniqueName: \"kubernetes.io/projected/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-kube-api-access-jw6ml\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-run\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499195 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-os-release\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499207 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-cni-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkpb\" (UniqueName: \"kubernetes.io/projected/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-kube-api-access-jkkpb\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysconfig\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysctl-d\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499376 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-sys\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-cni-binary-copy\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-socket-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.500898 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovnkube-config\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-sys\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499490 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-run-ovn\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-lib-modules\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-var-lib-openvswitch\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-cni-netd\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-socket-dir\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499705 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysctl-d\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-sysconfig\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-run\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/23dada4b-3bff-4763-9499-d08a34391b70-konnectivity-ca\") pod \"konnectivity-agent-xn7rw\" (UID: \"23dada4b-3bff-4763-9499-d08a34391b70\") " pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-netns\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499959 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-system-cni-dir\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.499998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ws26\" (UniqueName: \"kubernetes.io/projected/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-kube-api-access-5ws26\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-systemd\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2zwp\" (UniqueName: \"kubernetes.io/projected/85559148-4ea4-4bfd-8bf0-55be583da361-kube-api-access-p2zwp\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " 
pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.501694 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500095 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-system-cni-dir\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-cni-binary-copy\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-hostroot\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-run-ovn-kubernetes\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-daemon-config\") pod 
\"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-host-run-ovn-kubernetes\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-systemd\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovnkube-config\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.502450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.500476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.502890 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.502680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f64f9328-2e8e-457d-ab14-8b16c32be65a-etc-tuned\") pod \"tuned-xxdx2\" 
(UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.503411 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.503370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f64f9328-2e8e-457d-ab14-8b16c32be65a-tmp\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.503559 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.503504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-ovn-node-metrics-cert\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.505577 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.505555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2l2n\" (UniqueName: \"kubernetes.io/projected/2f25cb44-1f59-45ee-8bd4-d80ef4c1366b-kube-api-access-t2l2n\") pod \"node-resolver-q7rfs\" (UID: \"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b\") " pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.511386 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.511347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2tgr\" (UniqueName: \"kubernetes.io/projected/f64f9328-2e8e-457d-ab14-8b16c32be65a-kube-api-access-t2tgr\") pod \"tuned-xxdx2\" (UID: \"f64f9328-2e8e-457d-ab14-8b16c32be65a\") " pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.512306 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.512270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k29n\" (UniqueName: \"kubernetes.io/projected/8ce849fd-7b86-4acc-b03c-5583cbf4cc68-kube-api-access-2k29n\") pod 
\"multus-additional-cni-plugins-rlt5l\" (UID: \"8ce849fd-7b86-4acc-b03c-5583cbf4cc68\") " pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.514103 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.514078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ws26\" (UniqueName: \"kubernetes.io/projected/2bf8d8f9-a085-4c41-8558-9fd3edcddb6f-kube-api-access-5ws26\") pod \"iptables-alerter-48872\" (UID: \"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f\") " pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.514103 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.514093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6ml\" (UniqueName: \"kubernetes.io/projected/9890b61f-81d9-4bd9-a0d8-9cbf41de4590-kube-api-access-jw6ml\") pod \"ovnkube-node-xt7hd\" (UID: \"9890b61f-81d9-4bd9-a0d8-9cbf41de4590\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.516754 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.516709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtbh\" (UniqueName: \"kubernetes.io/projected/c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4-kube-api-access-6dtbh\") pod \"aws-ebs-csi-driver-node-dps9l\" (UID: \"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.536895 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.536867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" event={"ID":"bce90b98ae5ab75e21e2fc6e3c6353e7","Type":"ContainerStarted","Data":"cfb441f2b62e2a350e13816a283dfd63319631ec68d0d01a5b197d6b8979a8fe"} Apr 21 07:02:57.600641 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-cni-multus\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.600641 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-cni-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.600860 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkpb\" (UniqueName: \"kubernetes.io/projected/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-kube-api-access-jkkpb\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.600860 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/23dada4b-3bff-4763-9499-d08a34391b70-konnectivity-ca\") pod \"konnectivity-agent-xn7rw\" (UID: \"23dada4b-3bff-4763-9499-d08a34391b70\") " pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.600860 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-netns\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.600860 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-cni-multus\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.600860 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2zwp\" (UniqueName: \"kubernetes.io/projected/85559148-4ea4-4bfd-8bf0-55be583da361-kube-api-access-p2zwp\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.601066 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.600999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-cni-binary-copy\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601066 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-hostroot\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-daemon-config\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-netns\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkht\" (UniqueName: \"kubernetes.io/projected/e4d3a3ee-1584-42b6-a403-4bb39d451cab-kube-api-access-wwkht\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:57.601155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-cni-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-hostroot\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-kubelet\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-conf-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-socket-dir-parent\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601238 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-cni-bin\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/23dada4b-3bff-4763-9499-d08a34391b70-konnectivity-ca\") pod \"konnectivity-agent-xn7rw\" (UID: \"23dada4b-3bff-4763-9499-d08a34391b70\") " pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601266 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-kubelet\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/85559148-4ea4-4bfd-8bf0-55be583da361-host\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-var-lib-cni-bin\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85559148-4ea4-4bfd-8bf0-55be583da361-host\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-k8s-cni-cncf-io\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-socket-dir-parent\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-multus-certs\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-k8s-cni-cncf-io\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-os-release\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.601436 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-host-run-multus-certs\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601375 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-conf-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-os-release\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-etc-kubernetes\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-etc-kubernetes\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-system-cni-dir\") pod 
\"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-cnibin\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/85559148-4ea4-4bfd-8bf0-55be583da361-serviceca\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/23dada4b-3bff-4763-9499-d08a34391b70-agent-certs\") pod \"konnectivity-agent-xn7rw\" (UID: \"23dada4b-3bff-4763-9499-d08a34391b70\") " pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-system-cni-dir\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.601653 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-cni-binary-copy\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601698 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-cnibin\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.601741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-multus-daemon-config\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.601797 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. No retries permitted until 2026-04-21 07:02:58.101744428 +0000 UTC m=+3.133795655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:02:57.602144 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.602101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/85559148-4ea4-4bfd-8bf0-55be583da361-serviceca\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.604154 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.604128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/23dada4b-3bff-4763-9499-d08a34391b70-agent-certs\") pod \"konnectivity-agent-xn7rw\" (UID: \"23dada4b-3bff-4763-9499-d08a34391b70\") " pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.607588 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.607567 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:02:57.607588 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.607590 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:02:57.607731 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:57.607603 2573 projected.go:194] Error preparing data for projected volume kube-api-access-shz7s for pod openshift-network-diagnostics/network-check-target-qv6dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:02:57.607731 ip-10-0-143-69 
kubenswrapper[2573]: E0421 07:02:57.607677 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s podName:5affaa81-79dd-4de7-85b9-98182a2406f0 nodeName:}" failed. No retries permitted until 2026-04-21 07:02:58.107659138 +0000 UTC m=+3.139710359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-shz7s" (UniqueName: "kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s") pod "network-check-target-qv6dn" (UID: "5affaa81-79dd-4de7-85b9-98182a2406f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:02:57.609237 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.609207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkpb\" (UniqueName: \"kubernetes.io/projected/08ad7b6d-5db7-4175-a947-75d82fb3d9ef-kube-api-access-jkkpb\") pod \"multus-6krhs\" (UID: \"08ad7b6d-5db7-4175-a947-75d82fb3d9ef\") " pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.609723 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.609702 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2zwp\" (UniqueName: \"kubernetes.io/projected/85559148-4ea4-4bfd-8bf0-55be583da361-kube-api-access-p2zwp\") pod \"node-ca-zzsbr\" (UID: \"85559148-4ea4-4bfd-8bf0-55be583da361\") " pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.609831 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.609754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkht\" (UniqueName: \"kubernetes.io/projected/e4d3a3ee-1584-42b6-a403-4bb39d451cab-kube-api-access-wwkht\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:57.687160 
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.687072 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:02:57.697984 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.697962 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" Apr 21 07:02:57.704723 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.704691 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q7rfs" Apr 21 07:02:57.710754 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.710733 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" Apr 21 07:02:57.719286 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.719268 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" Apr 21 07:02:57.727809 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.727789 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-48872" Apr 21 07:02:57.733330 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.733308 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:02:57.739924 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.739894 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zzsbr" Apr 21 07:02:57.745533 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.745502 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6krhs" Apr 21 07:02:57.977168 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:57.977143 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:02:58.105064 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.105035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:58.105230 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.105199 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:02:58.105285 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.105274 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. No retries permitted until 2026-04-21 07:02:59.105259984 +0000 UTC m=+4.137311195 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:02:58.206256 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.206218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:02:58.206400 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.206356 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:02:58.206400 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.206374 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:02:58.206400 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.206388 2573 projected.go:194] Error preparing data for projected volume kube-api-access-shz7s for pod openshift-network-diagnostics/network-check-target-qv6dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:02:58.206549 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.206452 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s podName:5affaa81-79dd-4de7-85b9-98182a2406f0 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:02:59.206434516 +0000 UTC m=+4.238485724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-shz7s" (UniqueName: "kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s") pod "network-check-target-qv6dn" (UID: "5affaa81-79dd-4de7-85b9-98182a2406f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:02:58.275037 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.274811 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf8d8f9_a085_4c41_8558_9fd3edcddb6f.slice/crio-e1f0ca4b4d3b181f6fedfd0ee14bf63d28e6ff31dbeee274707fdbf17a0934e3 WatchSource:0}: Error finding container e1f0ca4b4d3b181f6fedfd0ee14bf63d28e6ff31dbeee274707fdbf17a0934e3: Status 404 returned error can't find the container with id e1f0ca4b4d3b181f6fedfd0ee14bf63d28e6ff31dbeee274707fdbf17a0934e3 Apr 21 07:02:58.276635 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.276481 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9890b61f_81d9_4bd9_a0d8_9cbf41de4590.slice/crio-af7adf57e1b24ae35f7a966c02bb810a6f3334c46d03f9a2efd8dd1e53399f6b WatchSource:0}: Error finding container af7adf57e1b24ae35f7a966c02bb810a6f3334c46d03f9a2efd8dd1e53399f6b: Status 404 returned error can't find the container with id af7adf57e1b24ae35f7a966c02bb810a6f3334c46d03f9a2efd8dd1e53399f6b Apr 21 07:02:58.279881 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.279855 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23dada4b_3bff_4763_9499_d08a34391b70.slice/crio-36ef1f81c7dbba15e55d3c0ac89c615c8ee0fcbdf602472139d3cc595c573bff WatchSource:0}: Error finding container 
36ef1f81c7dbba15e55d3c0ac89c615c8ee0fcbdf602472139d3cc595c573bff: Status 404 returned error can't find the container with id 36ef1f81c7dbba15e55d3c0ac89c615c8ee0fcbdf602472139d3cc595c573bff Apr 21 07:02:58.280500 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.280474 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f25cb44_1f59_45ee_8bd4_d80ef4c1366b.slice/crio-219eb369947c3812989c59f4e644c9c48846ba84df1d92a7c312d318327c7b6a WatchSource:0}: Error finding container 219eb369947c3812989c59f4e644c9c48846ba84df1d92a7c312d318327c7b6a: Status 404 returned error can't find the container with id 219eb369947c3812989c59f4e644c9c48846ba84df1d92a7c312d318327c7b6a Apr 21 07:02:58.281259 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.281235 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc146c4d5_3700_4a77_bb4c_1fb3a2b4a1c4.slice/crio-05d1ed57891309fa468d3974ed02ca058533ffc08d5d65dc31425086bd851c6b WatchSource:0}: Error finding container 05d1ed57891309fa468d3974ed02ca058533ffc08d5d65dc31425086bd851c6b: Status 404 returned error can't find the container with id 05d1ed57891309fa468d3974ed02ca058533ffc08d5d65dc31425086bd851c6b Apr 21 07:02:58.282997 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.282888 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08ad7b6d_5db7_4175_a947_75d82fb3d9ef.slice/crio-27bd32905a625401ec85599e275289b4baa5a2b44095e56ae66134b3d7a0f12b WatchSource:0}: Error finding container 27bd32905a625401ec85599e275289b4baa5a2b44095e56ae66134b3d7a0f12b: Status 404 returned error can't find the container with id 27bd32905a625401ec85599e275289b4baa5a2b44095e56ae66134b3d7a0f12b Apr 21 07:02:58.283662 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.283443 2573 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85559148_4ea4_4bfd_8bf0_55be583da361.slice/crio-9bcb436636dfae2cbc4876cb3c7bdf7379d8d8b396923caf0f947ffdad4d9e30 WatchSource:0}: Error finding container 9bcb436636dfae2cbc4876cb3c7bdf7379d8d8b396923caf0f947ffdad4d9e30: Status 404 returned error can't find the container with id 9bcb436636dfae2cbc4876cb3c7bdf7379d8d8b396923caf0f947ffdad4d9e30 Apr 21 07:02:58.284302 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:02:58.284186 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce849fd_7b86_4acc_b03c_5583cbf4cc68.slice/crio-aad07169b1e0338426dc6b73715bda1dd8b6010f8a1926bc6a79550970788826 WatchSource:0}: Error finding container aad07169b1e0338426dc6b73715bda1dd8b6010f8a1926bc6a79550970788826: Status 404 returned error can't find the container with id aad07169b1e0338426dc6b73715bda1dd8b6010f8a1926bc6a79550970788826 Apr 21 07:02:58.426776 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.426747 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:02:58.433734 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.433709 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:57:56 +0000 UTC" deadline="2028-02-02 07:04:04.482179724 +0000 UTC" Apr 21 07:02:58.433734 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.433733 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15648h1m6.048448951s" Apr 21 07:02:58.531209 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.531142 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:02:58.531320 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:58.531240 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:02:58.538907 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.538884 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6krhs" event={"ID":"08ad7b6d-5db7-4175-a947-75d82fb3d9ef","Type":"ContainerStarted","Data":"27bd32905a625401ec85599e275289b4baa5a2b44095e56ae66134b3d7a0f12b"} Apr 21 07:02:58.539926 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.539901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" event={"ID":"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4","Type":"ContainerStarted","Data":"05d1ed57891309fa468d3974ed02ca058533ffc08d5d65dc31425086bd851c6b"} Apr 21 07:02:58.540837 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.540806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q7rfs" event={"ID":"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b","Type":"ContainerStarted","Data":"219eb369947c3812989c59f4e644c9c48846ba84df1d92a7c312d318327c7b6a"} Apr 21 07:02:58.541755 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.541732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"af7adf57e1b24ae35f7a966c02bb810a6f3334c46d03f9a2efd8dd1e53399f6b"} Apr 21 07:02:58.542782 ip-10-0-143-69 kubenswrapper[2573]: I0421 
07:02:58.542757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerStarted","Data":"aad07169b1e0338426dc6b73715bda1dd8b6010f8a1926bc6a79550970788826"} Apr 21 07:02:58.543852 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.543815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zzsbr" event={"ID":"85559148-4ea4-4bfd-8bf0-55be583da361","Type":"ContainerStarted","Data":"9bcb436636dfae2cbc4876cb3c7bdf7379d8d8b396923caf0f947ffdad4d9e30"} Apr 21 07:02:58.544711 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.544689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xn7rw" event={"ID":"23dada4b-3bff-4763-9499-d08a34391b70","Type":"ContainerStarted","Data":"36ef1f81c7dbba15e55d3c0ac89c615c8ee0fcbdf602472139d3cc595c573bff"} Apr 21 07:02:58.545797 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.545768 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-48872" event={"ID":"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f","Type":"ContainerStarted","Data":"e1f0ca4b4d3b181f6fedfd0ee14bf63d28e6ff31dbeee274707fdbf17a0934e3"} Apr 21 07:02:58.547271 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.547245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" event={"ID":"192bfeaa4c26d06d04fe2b9437ecbb37","Type":"ContainerStarted","Data":"7cbbed3eb307c925c1c0f1c808e3f151448d6f33f9fa9457b2b8c9c5ab7429a1"} Apr 21 07:02:58.548204 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.548181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" event={"ID":"f64f9328-2e8e-457d-ab14-8b16c32be65a","Type":"ContainerStarted","Data":"0ab5c2cfe1f4a8c8d26f904270b393ef38cad925973b87226ccdd0d611d3a0a3"} Apr 21 
07:02:58.562149 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:58.562108 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-69.ec2.internal" podStartSLOduration=2.562097262 podStartE2EDuration="2.562097262s" podCreationTimestamp="2026-04-21 07:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:02:58.561663943 +0000 UTC m=+3.593715173" watchObservedRunningTime="2026-04-21 07:02:58.562097262 +0000 UTC m=+3.594148489" Apr 21 07:02:59.114133 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:59.114093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:59.114299 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.114254 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:02:59.114373 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.114320 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:01.11430082 +0000 UTC m=+6.146352032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:02:59.214853 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:59.214817 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:02:59.215025 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.215008 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:02:59.215082 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.215034 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:02:59.215082 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.215046 2573 projected.go:194] Error preparing data for projected volume kube-api-access-shz7s for pod openshift-network-diagnostics/network-check-target-qv6dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:02:59.215178 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.215100 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s podName:5affaa81-79dd-4de7-85b9-98182a2406f0 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:01.21508335 +0000 UTC m=+6.247134556 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-shz7s" (UniqueName: "kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s") pod "network-check-target-qv6dn" (UID: "5affaa81-79dd-4de7-85b9-98182a2406f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:02:59.534404 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:59.533893 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:02:59.534404 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:02:59.534036 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:02:59.565260 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:59.565216 2573 generic.go:358] "Generic (PLEG): container finished" podID="bce90b98ae5ab75e21e2fc6e3c6353e7" containerID="78cd6d8cf78a0aca2bf24275e88014324c7646d7333f7a9cba5c20035518f6b2" exitCode=0 Apr 21 07:02:59.565424 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:02:59.565366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" event={"ID":"bce90b98ae5ab75e21e2fc6e3c6353e7","Type":"ContainerDied","Data":"78cd6d8cf78a0aca2bf24275e88014324c7646d7333f7a9cba5c20035518f6b2"} Apr 21 07:03:00.531720 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:00.531690 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:00.531888 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:00.531804 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:00.588600 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:00.588552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" event={"ID":"bce90b98ae5ab75e21e2fc6e3c6353e7","Type":"ContainerStarted","Data":"5b129139a74fc601d29859238698dd118652a960d6931d66da0a36d4688edaf3"} Apr 21 07:03:00.613149 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:00.613094 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-69.ec2.internal" podStartSLOduration=4.613076232 podStartE2EDuration="4.613076232s" podCreationTimestamp="2026-04-21 07:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:03:00.612617904 +0000 UTC m=+5.644669133" watchObservedRunningTime="2026-04-21 07:03:00.613076232 +0000 UTC m=+5.645127461" Apr 21 07:03:01.129790 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.129751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 
07:03:01.129978 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.129925 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:01.130038 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.129988 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:05.129968427 +0000 UTC m=+10.162019640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:01.186570 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.185755 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mqxbs"] Apr 21 07:03:01.188688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.188664 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.188817 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.188748 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:01.230607 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.230566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-kubelet-config\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.230787 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.230644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-dbus\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.230787 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.230686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.230787 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.230738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:01.231014 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.230964 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:01.231014 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.230999 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:01.231014 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.231012 2573 projected.go:194] Error preparing data for projected volume kube-api-access-shz7s for pod openshift-network-diagnostics/network-check-target-qv6dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:01.231200 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.231104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s podName:5affaa81-79dd-4de7-85b9-98182a2406f0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:05.231081375 +0000 UTC m=+10.263132587 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-shz7s" (UniqueName: "kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s") pod "network-check-target-qv6dn" (UID: "5affaa81-79dd-4de7-85b9-98182a2406f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.331695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-kubelet-config\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.331739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-dbus\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.331781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.331948 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.332006 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret podName:240ddbe6-7b0f-4f03-9c28-38b3756ea88b nodeName:}" failed. No retries permitted until 2026-04-21 07:03:01.831988849 +0000 UTC m=+6.864040059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret") pod "global-pull-secret-syncer-mqxbs" (UID: "240ddbe6-7b0f-4f03-9c28-38b3756ea88b") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.332251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-kubelet-config\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.332419 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.332381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-dbus\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.531860 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.531754 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:01.532040 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.531905 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:01.836432 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:01.836341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:01.836875 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.836484 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:01.836875 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:01.836565 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret podName:240ddbe6-7b0f-4f03-9c28-38b3756ea88b nodeName:}" failed. No retries permitted until 2026-04-21 07:03:02.836546646 +0000 UTC m=+7.868597861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret") pod "global-pull-secret-syncer-mqxbs" (UID: "240ddbe6-7b0f-4f03-9c28-38b3756ea88b") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:02.532265 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:02.531752 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:02.532265 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:02.531892 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:02.532265 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:02.532003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:02.532265 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:02.532115 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:02.844712 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:02.844624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:02.845173 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:02.844774 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:02.845173 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:02.844835 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret podName:240ddbe6-7b0f-4f03-9c28-38b3756ea88b nodeName:}" failed. No retries permitted until 2026-04-21 07:03:04.844818803 +0000 UTC m=+9.876870031 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret") pod "global-pull-secret-syncer-mqxbs" (UID: "240ddbe6-7b0f-4f03-9c28-38b3756ea88b") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:03.534797 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:03.534326 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:03.534797 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:03.534457 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:04.531673 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:04.531608 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:04.532185 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:04.531731 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:04.532185 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:04.531800 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:04.532185 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:04.531890 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:04.860681 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:04.860579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:04.860856 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:04.860739 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:04.860856 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:04.860814 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret podName:240ddbe6-7b0f-4f03-9c28-38b3756ea88b nodeName:}" failed. No retries permitted until 2026-04-21 07:03:08.860794447 +0000 UTC m=+13.892845665 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret") pod "global-pull-secret-syncer-mqxbs" (UID: "240ddbe6-7b0f-4f03-9c28-38b3756ea88b") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:05.163616 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:05.163524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:05.163891 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.163671 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:05.163891 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.163746 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:13.163725526 +0000 UTC m=+18.195776749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:05.264470 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:05.264429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:05.264662 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.264632 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:05.264662 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.264651 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:05.264662 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.264663 2573 projected.go:194] Error preparing data for projected volume kube-api-access-shz7s for pod openshift-network-diagnostics/network-check-target-qv6dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:05.264832 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.264718 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s podName:5affaa81-79dd-4de7-85b9-98182a2406f0 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:13.264699851 +0000 UTC m=+18.296751070 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-shz7s" (UniqueName: "kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s") pod "network-check-target-qv6dn" (UID: "5affaa81-79dd-4de7-85b9-98182a2406f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:05.532227 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:05.532191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:05.532642 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:05.532325 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:06.531304 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:06.531268 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:06.531304 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:06.531297 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:06.531554 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:06.531378 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:06.531554 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:06.531521 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:07.531590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:07.531561 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:07.532030 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:07.531702 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:08.531674 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:08.531647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:08.532107 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:08.531646 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:08.532107 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:08.531763 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:08.532107 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:08.531865 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:08.889751 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:08.889648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:08.889897 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:08.889810 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:08.889897 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:08.889890 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret podName:240ddbe6-7b0f-4f03-9c28-38b3756ea88b nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:16.889869113 +0000 UTC m=+21.921920333 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret") pod "global-pull-secret-syncer-mqxbs" (UID: "240ddbe6-7b0f-4f03-9c28-38b3756ea88b") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:09.531697 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:09.531667 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:09.532138 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:09.531799 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:10.531374 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:10.531343 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:10.531533 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:10.531343 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:10.531533 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:10.531442 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:10.531626 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:10.531530 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:11.532083 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:11.532045 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:11.532546 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:11.532194 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:12.531781 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:12.531746 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:12.531998 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:12.531748 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:12.531998 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:12.531859 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:12.531998 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:12.531945 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:13.221259 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:13.221214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:13.221739 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.221372 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:13.221739 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.221444 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:29.221429225 +0000 UTC m=+34.253480431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:13.321904 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:13.321862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:13.322068 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.322000 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:13.322068 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.322015 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:13.322068 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.322024 2573 projected.go:194] Error preparing data for projected volume kube-api-access-shz7s for pod openshift-network-diagnostics/network-check-target-qv6dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:13.322207 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.322074 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s 
podName:5affaa81-79dd-4de7-85b9-98182a2406f0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.322056929 +0000 UTC m=+34.354108135 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-shz7s" (UniqueName: "kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s") pod "network-check-target-qv6dn" (UID: "5affaa81-79dd-4de7-85b9-98182a2406f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:13.531909 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:13.531878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:13.532047 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:13.531985 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:14.531873 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:14.531838 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:14.532255 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:14.531855 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:14.532255 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:14.531971 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:14.532255 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:14.532031 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:15.535605 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:15.533801 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:15.535605 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:15.533953 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:15.629145 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:15.628628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"77d1545fa5259188ec0df55136b4797f41e1187ea7faf429a32ddacd321a385f"} Apr 21 07:03:15.636813 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:15.636775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" event={"ID":"f64f9328-2e8e-457d-ab14-8b16c32be65a","Type":"ContainerStarted","Data":"d468c2b28e530782b066d8319e1b39ca8ccb9ddd1d192489ad76ef1156bbb049"} Apr 21 07:03:16.531416 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.531255 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:16.531590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.531254 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:16.531590 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:16.531521 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:16.531590 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:16.531555 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:16.641207 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.641174 2573 generic.go:358] "Generic (PLEG): container finished" podID="8ce849fd-7b86-4acc-b03c-5583cbf4cc68" containerID="06aa8e5848eeabacdac0743ba302f7fe2143bb7d55ca22402e9009490c27f597" exitCode=0 Apr 21 07:03:16.641894 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.641261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerDied","Data":"06aa8e5848eeabacdac0743ba302f7fe2143bb7d55ca22402e9009490c27f597"} Apr 21 07:03:16.643295 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.643187 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zzsbr" event={"ID":"85559148-4ea4-4bfd-8bf0-55be583da361","Type":"ContainerStarted","Data":"baed9ae85011d75a2d19275c66c6d1dc50e2233368777e556db4adebdc0cd27c"} Apr 21 07:03:16.644916 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.644880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xn7rw" event={"ID":"23dada4b-3bff-4763-9499-d08a34391b70","Type":"ContainerStarted","Data":"1b5b7ce57ba47607f6dc73beff93184787f4734177bd0923859abf7bc3715810"} Apr 21 07:03:16.646228 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.646208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-6krhs" event={"ID":"08ad7b6d-5db7-4175-a947-75d82fb3d9ef","Type":"ContainerStarted","Data":"2ca40785bd44da73055ea53e8bfeaf101aec227ddb41d0f37324c8b7618912df"} Apr 21 07:03:16.647795 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.647775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" event={"ID":"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4","Type":"ContainerStarted","Data":"c6877014f6377e9329ed59e4fe365ee6d517c6e01aaa93a21f8c3f6aee551c84"} Apr 21 07:03:16.649202 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.649180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q7rfs" event={"ID":"2f25cb44-1f59-45ee-8bd4-d80ef4c1366b","Type":"ContainerStarted","Data":"7cfead9556700ff8e3027cedd912e6239e0619ee90bc0dc3520e73d8b93319cb"} Apr 21 07:03:16.652220 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.652194 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"232bbcc65f71588594883c22077185737a54833551be2036189c393702af211b"} Apr 21 07:03:16.652300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.652228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"2574453f1bdb93345285bd07e5ec33323d27cc85880c027b43084204a32bda9d"} Apr 21 07:03:16.652300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.652241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"f20c4a5f6884583e1294007a1c915f32caa5947bbec33ec0dc5b068d69518a0f"} Apr 21 07:03:16.652300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.652256 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"419b362b6d4c472c3f0733a3a18b237daf3b38c08715a2d94fd47298c5bd4884"} Apr 21 07:03:16.652300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.652269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"a58a243b48a06c75f3c2c68d06f23d65747d6774ed7d599ac486e1260fdca468"} Apr 21 07:03:16.666180 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.666126 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xxdx2" podStartSLOduration=4.586836788 podStartE2EDuration="21.666110663s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.310898544 +0000 UTC m=+3.342949749" lastFinishedPulling="2026-04-21 07:03:15.390172411 +0000 UTC m=+20.422223624" observedRunningTime="2026-04-21 07:03:15.654967685 +0000 UTC m=+20.687018914" watchObservedRunningTime="2026-04-21 07:03:16.666110663 +0000 UTC m=+21.698161892" Apr 21 07:03:16.679185 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.679151 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q7rfs" podStartSLOduration=4.571376848 podStartE2EDuration="21.679141009s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.282251945 +0000 UTC m=+3.314303151" lastFinishedPulling="2026-04-21 07:03:15.390016093 +0000 UTC m=+20.422067312" observedRunningTime="2026-04-21 07:03:16.679003845 +0000 UTC m=+21.711055072" watchObservedRunningTime="2026-04-21 07:03:16.679141009 +0000 UTC m=+21.711192237" Apr 21 07:03:16.692604 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.692575 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kube-system/konnectivity-agent-xn7rw" podStartSLOduration=4.586767731 podStartE2EDuration="21.692565177s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.282611027 +0000 UTC m=+3.314662232" lastFinishedPulling="2026-04-21 07:03:15.388408463 +0000 UTC m=+20.420459678" observedRunningTime="2026-04-21 07:03:16.692250638 +0000 UTC m=+21.724301867" watchObservedRunningTime="2026-04-21 07:03:16.692565177 +0000 UTC m=+21.724616405" Apr 21 07:03:16.723994 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.723967 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zzsbr" podStartSLOduration=9.451838194 podStartE2EDuration="21.723958138s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.30463118 +0000 UTC m=+3.336682385" lastFinishedPulling="2026-04-21 07:03:10.576751119 +0000 UTC m=+15.608802329" observedRunningTime="2026-04-21 07:03:16.723950692 +0000 UTC m=+21.756001919" watchObservedRunningTime="2026-04-21 07:03:16.723958138 +0000 UTC m=+21.756009366" Apr 21 07:03:16.724080 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.724021 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6krhs" podStartSLOduration=4.541828257 podStartE2EDuration="21.724016899s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.305244664 +0000 UTC m=+3.337295870" lastFinishedPulling="2026-04-21 07:03:15.487433295 +0000 UTC m=+20.519484512" observedRunningTime="2026-04-21 07:03:16.707989902 +0000 UTC m=+21.740041142" watchObservedRunningTime="2026-04-21 07:03:16.724016899 +0000 UTC m=+21.756068127" Apr 21 07:03:16.756198 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.756176 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 
07:03:16.948662 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:16.948592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:16.948796 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:16.948747 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:16.948859 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:16.948829 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret podName:240ddbe6-7b0f-4f03-9c28-38b3756ea88b nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.948808628 +0000 UTC m=+37.980859840 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret") pod "global-pull-secret-syncer-mqxbs" (UID: "240ddbe6-7b0f-4f03-9c28-38b3756ea88b") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:03:17.449886 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.449777 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:03:16.756192225Z","UUID":"69cdc549-9b1f-4f88-9110-2295695b5dcb","Handler":null,"Name":"","Endpoint":""} Apr 21 07:03:17.451380 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.451357 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 07:03:17.451380 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.451383 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 07:03:17.532075 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.532048 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:17.532246 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:17.532191 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab" Apr 21 07:03:17.655271 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.655235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-48872" event={"ID":"2bf8d8f9-a085-4c41-8558-9fd3edcddb6f","Type":"ContainerStarted","Data":"e41068a3ac9535d045220f481c9a5ec9c3197ceb448fbfc8e19665f1616e4798"} Apr 21 07:03:17.657218 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.657190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" event={"ID":"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4","Type":"ContainerStarted","Data":"cf9a3e269512f08002c11767c7f7cebd87aee87b601e4ba7fede69e018b56eb6"} Apr 21 07:03:17.670474 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:17.670415 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-48872" podStartSLOduration=10.37054707 podStartE2EDuration="22.670398992s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.276853457 +0000 UTC m=+3.308904675" lastFinishedPulling="2026-04-21 07:03:10.576705386 +0000 UTC m=+15.608756597" observedRunningTime="2026-04-21 07:03:17.669853526 +0000 UTC m=+22.701904753" watchObservedRunningTime="2026-04-21 07:03:17.670398992 +0000 UTC m=+22.702450222" Apr 21 07:03:18.272127 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.272051 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:03:18.272702 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.272680 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xn7rw" Apr 21 07:03:18.531427 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.531350 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:18.531427 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.531378 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:18.531707 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:18.531456 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0" Apr 21 07:03:18.531707 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:18.531616 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b" Apr 21 07:03:18.661403 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.661364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" event={"ID":"c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4","Type":"ContainerStarted","Data":"4855fde6d40065b868aeffb95a017a2a895f05024a5d0ec1138d9720e9b8d6a0"} Apr 21 07:03:18.664585 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.664554 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"057a2c5713d45d30cee242d97f250e3c6bd82d6ffbd68da4ed737d142b6be1b0"} Apr 21 07:03:18.678762 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:18.678712 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dps9l" podStartSLOduration=4.277648636 podStartE2EDuration="23.678696845s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.283174937 +0000 UTC m=+3.315226149" lastFinishedPulling="2026-04-21 07:03:17.684223147 +0000 UTC m=+22.716274358" observedRunningTime="2026-04-21 07:03:18.67834464 +0000 UTC m=+23.710395869" watchObservedRunningTime="2026-04-21 07:03:18.678696845 +0000 UTC m=+23.710748076" Apr 21 07:03:19.531295 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:19.531078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:19.531451 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:19.531404 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab"
Apr 21 07:03:19.666258 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:19.666234 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 07:03:20.531185 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:20.531158 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:20.531330 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:20.531288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b"
Apr 21 07:03:20.531402 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:20.531348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:20.531486 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:20.531463 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0"
Apr 21 07:03:21.531839 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.531674 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt"
Apr 21 07:03:21.532443 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:21.531934 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab"
Apr 21 07:03:21.672199 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.672162 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" event={"ID":"9890b61f-81d9-4bd9-a0d8-9cbf41de4590","Type":"ContainerStarted","Data":"0c2140b442bdecdbfca1097e37a386a8ea62463104f8026010f77064ea8a75e7"}
Apr 21 07:03:21.672465 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.672443 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:03:21.673828 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.673804 2573 generic.go:358] "Generic (PLEG): container finished" podID="8ce849fd-7b86-4acc-b03c-5583cbf4cc68" containerID="51722ab7a0566fc1b1c5b889b39f8854ca63e220ad244834ba901a16f8b838f7" exitCode=0
Apr 21 07:03:21.673900 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.673840 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerDied","Data":"51722ab7a0566fc1b1c5b889b39f8854ca63e220ad244834ba901a16f8b838f7"}
Apr 21 07:03:21.687112 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.687094 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:03:21.708109 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.708068 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" podStartSLOduration=9.55571556 podStartE2EDuration="26.708058141s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.278700611 +0000 UTC m=+3.310751824" lastFinishedPulling="2026-04-21 07:03:15.431043184 +0000 UTC m=+20.463094405" observedRunningTime="2026-04-21 07:03:21.707645714 +0000 UTC m=+26.739696943" watchObservedRunningTime="2026-04-21 07:03:21.708058141 +0000 UTC m=+26.740109369"
Apr 21 07:03:21.773803 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.773778 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xn7rw"
Apr 21 07:03:21.773953 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.773886 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 07:03:21.774272 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:21.774253 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xn7rw"
Apr 21 07:03:22.504892 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.504866 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:03:22.532024 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.531962 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:22.532329 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.531962 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:22.532329 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:22.532073 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b"
Apr 21 07:03:22.532329 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:22.532126 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0"
Apr 21 07:03:22.677159 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.677123 2573 generic.go:358] "Generic (PLEG): container finished" podID="8ce849fd-7b86-4acc-b03c-5583cbf4cc68" containerID="9a72e1252a6ce06e067f231d0d8c480ed6e93364283c2c000f3a18c2d383bd44" exitCode=0
Apr 21 07:03:22.677294 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.677215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerDied","Data":"9a72e1252a6ce06e067f231d0d8c480ed6e93364283c2c000f3a18c2d383bd44"}
Apr 21 07:03:22.678022 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.677976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:03:22.692666 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.692645 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd"
Apr 21 07:03:22.721043 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.721018 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qv6dn"]
Apr 21 07:03:22.721142 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.721127 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:22.721252 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:22.721230 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0"
Apr 21 07:03:22.721854 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.721833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qzgxt"]
Apr 21 07:03:22.721941 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.721925 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt"
Apr 21 07:03:22.722015 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:22.721999 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab"
Apr 21 07:03:22.723196 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.723175 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mqxbs"]
Apr 21 07:03:22.723285 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:22.723256 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:22.723334 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:22.723317 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b"
Apr 21 07:03:23.680331 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:23.680303 2573 generic.go:358] "Generic (PLEG): container finished" podID="8ce849fd-7b86-4acc-b03c-5583cbf4cc68" containerID="b02358b86d485fd6d47333fe753908eb0ab8d272ddc61beca1a181a0e718e934" exitCode=0
Apr 21 07:03:23.680698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:23.680387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerDied","Data":"b02358b86d485fd6d47333fe753908eb0ab8d272ddc61beca1a181a0e718e934"}
Apr 21 07:03:24.531354 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:24.531322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:24.531677 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:24.531322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:24.531677 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:24.531470 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0"
Apr 21 07:03:24.531677 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:24.531547 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b"
Apr 21 07:03:24.531677 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:24.531663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt"
Apr 21 07:03:24.531893 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:24.531763 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab"
Apr 21 07:03:26.531986 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:26.531937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:26.532467 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:26.531939 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt"
Apr 21 07:03:26.532467 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:26.532077 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqxbs" podUID="240ddbe6-7b0f-4f03-9c28-38b3756ea88b"
Apr 21 07:03:26.532467 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:26.531937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:26.532467 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:26.532170 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qzgxt" podUID="e4d3a3ee-1584-42b6-a403-4bb39d451cab"
Apr 21 07:03:26.532467 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:26.532211 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv6dn" podUID="5affaa81-79dd-4de7-85b9-98182a2406f0"
Apr 21 07:03:28.299283 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.299250 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-69.ec2.internal" event="NodeReady"
Apr 21 07:03:28.299739 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.299405 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 07:03:28.350077 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.349484 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4gb9q"]
Apr 21 07:03:28.371526 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.370651 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d6f45d765-cf7kk"]
Apr 21 07:03:28.371526 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.370761 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:28.373908 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.373884 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 07:03:28.374491 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.374454 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 07:03:28.374491 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.374480 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-mkb5t\""
Apr 21 07:03:28.375192 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.375008 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.375560 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.375542 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.382539 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.382496 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 07:03:28.386247 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.386215 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb"]
Apr 21 07:03:28.386473 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.386450 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.395729 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.395705 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wwtlv\""
Apr 21 07:03:28.396045 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.396027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 07:03:28.396135 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.396074 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 07:03:28.396465 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.396444 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 07:03:28.399300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.399275 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 07:03:28.417613 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.417586 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w9nvv"]
Apr 21 07:03:28.439965 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.439936 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"]
Apr 21 07:03:28.440092 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.440049 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb"
Apr 21 07:03:28.440324 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.440303 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.444290 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.444267 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.446576 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.446494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 21 07:03:28.446827 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.446805 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 21 07:03:28.446827 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.446820 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.446966 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.446839 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-qxpw5\""
Apr 21 07:03:28.447198 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.447179 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-rr6fc\""
Apr 21 07:03:28.447296 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.447200 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.456956 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.456935 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.457067 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.456975 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 21 07:03:28.461536 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.461503 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m"]
Apr 21 07:03:28.461676 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.461653 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"
Apr 21 07:03:28.466590 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.466568 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 07:03:28.466738 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.466595 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.466738 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.466598 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 07:03:28.466738 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.466613 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zvxnz\""
Apr 21 07:03:28.466897 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.466880 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.484964 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.484921 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4gb9q"]
Apr 21 07:03:28.484964 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.484954 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-57f7f9fd66-mtt95"]
Apr 21 07:03:28.485173 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.485090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m"
Apr 21 07:03:28.487605 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.487418 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 07:03:28.487605 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.487448 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.487605 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.487483 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-926tf\""
Apr 21 07:03:28.487826 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.487764 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.496966 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.496950 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w9nvv"]
Apr 21 07:03:28.497072 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.496971 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4"]
Apr 21 07:03:28.497131 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.497097 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-57f7f9fd66-mtt95"
Apr 21 07:03:28.500205 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500187 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 07:03:28.500298 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500252 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 07:03:28.500476 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 07:03:28.500575 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500522 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.500575 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500534 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gssqw\""
Apr 21 07:03:28.500679 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500641 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 07:03:28.500778 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.500762 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.511239 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.511218 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr"]
Apr 21 07:03:28.511404 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.511381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4"
Apr 21 07:03:28.515886 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.515862 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:03:28.516365 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.516347 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 21 07:03:28.517064 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.517047 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 21 07:03:28.517156 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.517077 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-phcjm\""
Apr 21 07:03:28.517156 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.517082 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 21 07:03:28.529415 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.529394 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2"]
Apr 21 07:03:28.529572 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.529552 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr"
Apr 21 07:03:28.532563 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.532543 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 21 07:03:28.532663 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.532623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pftx5\""
Apr 21 07:03:28.533824 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.533804 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 21 07:03:28.535305 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-installation-pull-secrets\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535408 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535408 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-ca-trust-extracted\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535408 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-certificates\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535607 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-bound-sa-token\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535607 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44af391c-8f7a-471b-a4eb-25f3b5519c86-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.535607 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qf6l\" (UniqueName: \"kubernetes.io/projected/71f6c66b-8c36-497f-9098-f070725c4d1d-kube-api-access-5qf6l\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"
Apr 21 07:03:28.535720 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-serving-cert\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:28.535720 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-trusted-ca\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:28.535720 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44af391c-8f7a-471b-a4eb-25f3b5519c86-serving-cert\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.535826 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjs9m\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-kube-api-access-gjs9m\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535826 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-image-registry-private-configuration\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535826 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-config\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:28.535962 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/44af391c-8f7a-471b-a4eb-25f3b5519c86-snapshots\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.535962 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/71f6c66b-8c36-497f-9098-f070725c4d1d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"
Apr 21 07:03:28.535962 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-trusted-ca\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:28.535962 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535919 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npv95\" (UniqueName: \"kubernetes.io/projected/44af391c-8f7a-471b-a4eb-25f3b5519c86-kube-api-access-npv95\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.536289 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndd4\" (UniqueName: \"kubernetes.io/projected/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-kube-api-access-tndd4\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:28.536289 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.535991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44af391c-8f7a-471b-a4eb-25f3b5519c86-service-ca-bundle\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.536289 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.536014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44af391c-8f7a-471b-a4eb-25f3b5519c86-tmp\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv"
Apr 21 07:03:28.536289 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.536069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"
Apr 21 07:03:28.536289 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.536133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnlgd\" (UniqueName: \"kubernetes.io/projected/8da13794-b67d-4df5-9370-57ce6358959a-kube-api-access-qnlgd\") pod \"volume-data-source-validator-7c6cbb6c87-9kqbb\" (UID: \"8da13794-b67d-4df5-9370-57ce6358959a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb"
Apr 21 07:03:28.540044 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.540027 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"]
Apr 21 07:03:28.540123 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.540054 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps"]
Apr 21 07:03:28.540190 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.540171 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2"
Apr 21 07:03:28.540439 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.540413 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:28.540687 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.540649 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs" Apr 21 07:03:28.542764 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.542643 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 07:03:28.542764 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.542643 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:03:28.542764 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.542691 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:03:28.542962 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.542795 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 07:03:28.543099 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.543080 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:03:28.543099 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.543097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 07:03:28.543214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.543120 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 07:03:28.543214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.543122 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-nxb2w\"" Apr 21 07:03:28.543295 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.543220 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4vntj\"" Apr 21 07:03:28.556091 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556071 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d6f45d765-cf7kk"] Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556101 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2"] Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556117 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4"] Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556130 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57f7f9fd66-mtt95"] Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556141 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr"] Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556153 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m"] Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556158 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" Apr 21 07:03:28.556178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556167 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vrfxd"] Apr 21 07:03:28.556469 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.556161 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:28.558751 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.558647 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-g4w77\"" Apr 21 07:03:28.558751 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.558741 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5rqg2\"" Apr 21 07:03:28.558896 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.558842 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:03:28.571313 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.571206 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wknz9"] Apr 21 07:03:28.571470 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.571453 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:28.573965 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.573711 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 07:03:28.573965 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.573869 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-f8s28\"" Apr 21 07:03:28.574117 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.573970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 07:03:28.574117 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.573986 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 07:03:28.583392 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.583361 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb"] Apr 21 07:03:28.583486 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.583397 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vrfxd"] Apr 21 07:03:28.583486 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.583410 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps"] Apr 21 07:03:28.583486 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.583422 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wknz9"] Apr 21 07:03:28.583618 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.583571 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.587611 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.586932 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 07:03:28.587611 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.587167 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 07:03:28.587611 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.587538 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8pbf\"" Apr 21 07:03:28.636526 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-image-registry-private-configuration\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.636688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.636688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de46750f-df1b-4469-a3bd-4300d5fa0f79-config\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: 
\"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.636688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.636688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzjh\" (UniqueName: \"kubernetes.io/projected/de46750f-df1b-4469-a3bd-4300d5fa0f79-kube-api-access-4wzjh\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.636688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44af391c-8f7a-471b-a4eb-25f3b5519c86-tmp\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.636688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 
07:03:28.636726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-installation-pull-secrets\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.636767 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-serving-cert\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.636823 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls podName:71f6c66b-8c36-497f-9098-f070725c4d1d nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.136805202 +0000 UTC m=+34.168856414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6rh5r" (UID: "71f6c66b-8c36-497f-9098-f070725c4d1d") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-default-certificate\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8688\" (UniqueName: \"kubernetes.io/projected/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-kube-api-access-h8688\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qf6l\" (UniqueName: \"kubernetes.io/projected/71f6c66b-8c36-497f-9098-f070725c4d1d-kube-api-access-5qf6l\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" 
(UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:28.636971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-certificates\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.637393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.636980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-bound-sa-token\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.637393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd0f8ab2-d283-4079-8826-80cb40b62cab-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:28.637393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.637393 ip-10-0-143-69 
kubenswrapper[2573]: I0421 07:03:28.637262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44af391c-8f7a-471b-a4eb-25f3b5519c86-serving-cert\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.637393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9xx\" (UniqueName: \"kubernetes.io/projected/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-kube-api-access-mw9xx\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.637393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnlgd\" (UniqueName: \"kubernetes.io/projected/8da13794-b67d-4df5-9370-57ce6358959a-kube-api-access-qnlgd\") pod \"volume-data-source-validator-7c6cbb6c87-9kqbb\" (UID: \"8da13794-b67d-4df5-9370-57ce6358959a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb" Apr 21 07:03:28.637393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjhr2\" (UniqueName: \"kubernetes.io/projected/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-kube-api-access-jjhr2\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:28.637866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637414 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gjs9m\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-kube-api-access-gjs9m\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.637866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npv95\" (UniqueName: \"kubernetes.io/projected/44af391c-8f7a-471b-a4eb-25f3b5519c86-kube-api-access-npv95\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.637866 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.637493 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:03:28.637866 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.637532 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d6f45d765-cf7kk: secret "image-registry-tls" not found Apr 21 07:03:28.637866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.637492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/44af391c-8f7a-471b-a4eb-25f3b5519c86-tmp\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.638124 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.638040 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls podName:e2bfa921-c09b-4485-a7a3-a08eebc1ceba nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:29.138019921 +0000 UTC m=+34.170071142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls") pod "image-registry-5d6f45d765-cf7kk" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba") : secret "image-registry-tls" not found Apr 21 07:03:28.638124 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/71f6c66b-8c36-497f-9098-f070725c4d1d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:28.638243 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de46750f-df1b-4469-a3bd-4300d5fa0f79-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.638243 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-config\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.638243 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-certificates\") pod 
\"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.638243 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/44af391c-8f7a-471b-a4eb-25f3b5519c86-snapshots\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.638629 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-trusted-ca\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.638629 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h85lj\" (UniqueName: \"kubernetes.io/projected/bb10b08b-2c33-4546-889b-697fd8825b2f-kube-api-access-h85lj\") pod \"network-check-source-8894fc9bd-qpjps\" (UID: \"bb10b08b-2c33-4546-889b-697fd8825b2f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" Apr 21 07:03:28.638870 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/44af391c-8f7a-471b-a4eb-25f3b5519c86-snapshots\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.638870 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638840 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tndd4\" (UniqueName: \"kubernetes.io/projected/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-kube-api-access-tndd4\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.638870 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44af391c-8f7a-471b-a4eb-25f3b5519c86-service-ca-bundle\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.639091 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-config\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.639091 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/71f6c66b-8c36-497f-9098-f070725c4d1d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:28.639091 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.638900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") 
" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.639234 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-stats-auth\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.639234 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-ca-trust-extracted\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.639340 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639239 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44af391c-8f7a-471b-a4eb-25f3b5519c86-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.639340 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:28.639340 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-trusted-ca\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.639479 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:28.639571 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-ca-trust-extracted\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.639627 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44af391c-8f7a-471b-a4eb-25f3b5519c86-service-ca-bundle\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.639627 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.639551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-trusted-ca\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " 
pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.640148 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.640108 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44af391c-8f7a-471b-a4eb-25f3b5519c86-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.640250 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.640228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-trusted-ca\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.641899 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.641877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44af391c-8f7a-471b-a4eb-25f3b5519c86-serving-cert\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.641973 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.641910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-installation-pull-secrets\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.641973 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.641920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-serving-cert\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.641973 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.641961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-image-registry-private-configuration\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.647215 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.647192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qf6l\" (UniqueName: \"kubernetes.io/projected/71f6c66b-8c36-497f-9098-f070725c4d1d-kube-api-access-5qf6l\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:28.648661 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.648634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-bound-sa-token\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.649241 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.649211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjs9m\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-kube-api-access-gjs9m\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " 
pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:28.649493 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.649467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnlgd\" (UniqueName: \"kubernetes.io/projected/8da13794-b67d-4df5-9370-57ce6358959a-kube-api-access-qnlgd\") pod \"volume-data-source-validator-7c6cbb6c87-9kqbb\" (UID: \"8da13794-b67d-4df5-9370-57ce6358959a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb" Apr 21 07:03:28.649982 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.649951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndd4\" (UniqueName: \"kubernetes.io/projected/22ef3159-4fb3-4a8b-8264-e9ee14be3a04-kube-api-access-tndd4\") pod \"console-operator-9d4b6777b-4gb9q\" (UID: \"22ef3159-4fb3-4a8b-8264-e9ee14be3a04\") " pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.650949 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.650926 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npv95\" (UniqueName: \"kubernetes.io/projected/44af391c-8f7a-471b-a4eb-25f3b5519c86-kube-api-access-npv95\") pod \"insights-operator-585dfdc468-w9nvv\" (UID: \"44af391c-8f7a-471b-a4eb-25f3b5519c86\") " pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.684270 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.684220 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:28.739950 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.739905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.739950 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.739946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de46750f-df1b-4469-a3bd-4300d5fa0f79-config\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.740147 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4519c586-5721-4cb3-bffc-7f4b13237ef7-tmp-dir\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.740204 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.740204 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4wzjh\" (UniqueName: \"kubernetes.io/projected/de46750f-df1b-4469-a3bd-4300d5fa0f79-kube-api-access-4wzjh\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.740296 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nxpp\" (UniqueName: \"kubernetes.io/projected/0816ede2-8af6-41c9-b423-5c313bc38315-kube-api-access-9nxpp\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:28.740296 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4519c586-5721-4cb3-bffc-7f4b13237ef7-config-volume\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.740296 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.740282 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.240259847 +0000 UTC m=+34.272311054 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : configmap references non-existent config key: service-ca.crt Apr 21 07:03:28.740422 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-default-certificate\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.740422 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8688\" (UniqueName: \"kubernetes.io/projected/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-kube-api-access-h8688\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.740494 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd0f8ab2-d283-4079-8826-80cb40b62cab-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:28.740494 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de46750f-df1b-4469-a3bd-4300d5fa0f79-config\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.740494 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.740660 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9xx\" (UniqueName: \"kubernetes.io/projected/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-kube-api-access-mw9xx\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.740660 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjhr2\" (UniqueName: \"kubernetes.io/projected/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-kube-api-access-jjhr2\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:28.740660 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de46750f-df1b-4469-a3bd-4300d5fa0f79-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.740810 
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h85lj\" (UniqueName: \"kubernetes.io/projected/bb10b08b-2c33-4546-889b-697fd8825b2f-kube-api-access-h85lj\") pod \"network-check-source-8894fc9bd-qpjps\" (UID: \"bb10b08b-2c33-4546-889b-697fd8825b2f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" Apr 21 07:03:28.740810 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.740810 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.740751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-stats-auth\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.740810 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.740782 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:03:28.740993 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.740824 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.240809651 +0000 UTC m=+34.272860863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : secret "router-metrics-certs-default" not found Apr 21 07:03:28.741059 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.741020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.741152 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.741119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:28.741228 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.741158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.741228 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.741183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dgp4\" (UniqueName: \"kubernetes.io/projected/4519c586-5721-4cb3-bffc-7f4b13237ef7-kube-api-access-7dgp4\") pod \"dns-default-wknz9\" (UID: 
\"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.741228 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.741224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:28.741375 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.741240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd0f8ab2-d283-4079-8826-80cb40b62cab-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:28.742234 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.742177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:28.742686 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.742561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.742686 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.742587 2573 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:03:28.742686 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.742656 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls podName:ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.242640239 +0000 UTC m=+34.274691458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z97m" (UID: "ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2") : secret "samples-operator-tls" not found Apr 21 07:03:28.742950 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.742863 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:03:28.743257 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.743224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de46750f-df1b-4469-a3bd-4300d5fa0f79-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.743342 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.743310 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert podName:fd0f8ab2-d283-4079-8826-80cb40b62cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.243293978 +0000 UTC m=+34.275345190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-d8wdr" (UID: "fd0f8ab2-d283-4079-8826-80cb40b62cab") : secret "networking-console-plugin-cert" not found Apr 21 07:03:28.746257 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.743626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-default-certificate\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.746257 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.744569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-stats-auth\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.749920 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.749893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8688\" (UniqueName: \"kubernetes.io/projected/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-kube-api-access-h8688\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:28.750600 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.750556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzjh\" (UniqueName: \"kubernetes.io/projected/de46750f-df1b-4469-a3bd-4300d5fa0f79-kube-api-access-4wzjh\") pod \"service-ca-operator-d6fc45fc5-hq6q2\" (UID: \"de46750f-df1b-4469-a3bd-4300d5fa0f79\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.750600 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.750584 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85lj\" (UniqueName: \"kubernetes.io/projected/bb10b08b-2c33-4546-889b-697fd8825b2f-kube-api-access-h85lj\") pod \"network-check-source-8894fc9bd-qpjps\" (UID: \"bb10b08b-2c33-4546-889b-697fd8825b2f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" Apr 21 07:03:28.750737 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.750616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjhr2\" (UniqueName: \"kubernetes.io/projected/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-kube-api-access-jjhr2\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:28.751140 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.751120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9xx\" (UniqueName: \"kubernetes.io/projected/c2986c84-0eaa-4d7a-a7c4-5337ab7f4875-kube-api-access-mw9xx\") pod \"kube-storage-version-migrator-operator-6769c5d45-4pwc4\" (UID: \"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.752776 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.752748 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb" Apr 21 07:03:28.757977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.757957 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w9nvv" Apr 21 07:03:28.822430 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.822353 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" Apr 21 07:03:28.843576 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.843528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nxpp\" (UniqueName: \"kubernetes.io/projected/0816ede2-8af6-41c9-b423-5c313bc38315-kube-api-access-9nxpp\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:28.843731 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.843588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4519c586-5721-4cb3-bffc-7f4b13237ef7-config-volume\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.843795 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.843755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.843795 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.843783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dgp4\" (UniqueName: \"kubernetes.io/projected/4519c586-5721-4cb3-bffc-7f4b13237ef7-kube-api-access-7dgp4\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.843889 ip-10-0-143-69 
kubenswrapper[2573]: I0421 07:03:28.843813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:28.843889 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.843884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4519c586-5721-4cb3-bffc-7f4b13237ef7-tmp-dir\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.844035 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.843897 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:03:28.844035 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.843964 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls podName:4519c586-5721-4cb3-bffc-7f4b13237ef7 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.343942018 +0000 UTC m=+34.375993247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls") pod "dns-default-wknz9" (UID: "4519c586-5721-4cb3-bffc-7f4b13237ef7") : secret "dns-default-metrics-tls" not found Apr 21 07:03:28.844146 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.844120 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:03:28.844203 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:28.844189 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert podName:0816ede2-8af6-41c9-b423-5c313bc38315 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:29.344172778 +0000 UTC m=+34.376223986 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert") pod "ingress-canary-vrfxd" (UID: "0816ede2-8af6-41c9-b423-5c313bc38315") : secret "canary-serving-cert" not found Apr 21 07:03:28.844269 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.844240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4519c586-5721-4cb3-bffc-7f4b13237ef7-tmp-dir\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.844321 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.844262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4519c586-5721-4cb3-bffc-7f4b13237ef7-config-volume\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.851554 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.851527 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" Apr 21 07:03:28.855131 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.855109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dgp4\" (UniqueName: \"kubernetes.io/projected/4519c586-5721-4cb3-bffc-7f4b13237ef7-kube-api-access-7dgp4\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:28.855399 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.855381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nxpp\" (UniqueName: \"kubernetes.io/projected/0816ede2-8af6-41c9-b423-5c313bc38315-kube-api-access-9nxpp\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:28.874475 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:28.874452 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" Apr 21 07:03:29.147581 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.147487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:29.147581 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.147569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:29.147850 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.147644 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:29.147850 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.147714 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls podName:71f6c66b-8c36-497f-9098-f070725c4d1d nodeName:}" failed. No retries permitted until 2026-04-21 07:03:30.147692314 +0000 UTC m=+35.179743520 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6rh5r" (UID: "71f6c66b-8c36-497f-9098-f070725c4d1d") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:29.147850 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.147730 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:03:29.147850 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.147749 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d6f45d765-cf7kk: secret "image-registry-tls" not found Apr 21 07:03:29.147850 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.147803 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls podName:e2bfa921-c09b-4485-a7a3-a08eebc1ceba nodeName:}" failed. No retries permitted until 2026-04-21 07:03:30.147784715 +0000 UTC m=+35.179835923 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls") pod "image-registry-5d6f45d765-cf7kk" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba") : secret "image-registry-tls" not found Apr 21 07:03:29.248388 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.248351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:29.248602 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.248416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:29.248602 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.248485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:29.248602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248525 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:03:29.248602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248554 2573 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:03:29.248602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248596 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248616 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.248531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248598 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:30.24857956 +0000 UTC m=+35.280630781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : secret "router-metrics-certs-default" not found Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248674 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls podName:ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:30.248656617 +0000 UTC m=+35.280707828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z97m" (UID: "ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2") : secret "samples-operator-tls" not found Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248691 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs podName:e4d3a3ee-1584-42b6-a403-4bb39d451cab nodeName:}" failed. No retries permitted until 2026-04-21 07:04:01.248682831 +0000 UTC m=+66.280734038 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs") pod "network-metrics-daemon-qzgxt" (UID: "e4d3a3ee-1584-42b6-a403-4bb39d451cab") : secret "metrics-daemon-secret" not found Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248724 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert podName:fd0f8ab2-d283-4079-8826-80cb40b62cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:30.248711938 +0000 UTC m=+35.280763151 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-d8wdr" (UID: "fd0f8ab2-d283-4079-8826-80cb40b62cab") : secret "networking-console-plugin-cert" not found Apr 21 07:03:29.248840 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.248770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:29.249165 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.248896 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:30.248885492 +0000 UTC m=+35.280936698 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : configmap references non-existent config key: service-ca.crt Apr 21 07:03:29.349586 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.349552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:29.349952 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.349722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:29.349952 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.349758 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:29.349952 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.349844 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:03:29.349952 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.349897 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls podName:4519c586-5721-4cb3-bffc-7f4b13237ef7 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:30.349879364 +0000 UTC m=+35.381930577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls") pod "dns-default-wknz9" (UID: "4519c586-5721-4cb3-bffc-7f4b13237ef7") : secret "dns-default-metrics-tls" not found Apr 21 07:03:29.349952 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.349921 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:03:29.350149 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:29.349969 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert podName:0816ede2-8af6-41c9-b423-5c313bc38315 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:30.349956918 +0000 UTC m=+35.382008129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert") pod "ingress-canary-vrfxd" (UID: "0816ede2-8af6-41c9-b423-5c313bc38315") : secret "canary-serving-cert" not found Apr 21 07:03:29.351902 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.351887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shz7s\" (UniqueName: \"kubernetes.io/projected/5affaa81-79dd-4de7-85b9-98182a2406f0-kube-api-access-shz7s\") pod \"network-check-target-qv6dn\" (UID: \"5affaa81-79dd-4de7-85b9-98182a2406f0\") " pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:29.459549 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.459495 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:03:29.621309 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.621277 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4"] Apr 21 07:03:29.625402 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.625377 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4gb9q"] Apr 21 07:03:29.629850 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.629826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb"] Apr 21 07:03:29.639080 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.639051 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps"] Apr 21 07:03:29.641418 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.641397 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2"] Apr 21 07:03:29.655139 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.655085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qv6dn"] Apr 21 07:03:29.656628 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:29.656595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w9nvv"] Apr 21 07:03:29.692671 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.692645 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2986c84_0eaa_4d7a_a7c4_5337ab7f4875.slice/crio-89d6372b35af7a47b9d011bbb40c05531b4511eeeff894af0173549caa828e41 WatchSource:0}: Error finding container 
89d6372b35af7a47b9d011bbb40c05531b4511eeeff894af0173549caa828e41: Status 404 returned error can't find the container with id 89d6372b35af7a47b9d011bbb40c05531b4511eeeff894af0173549caa828e41 Apr 21 07:03:29.693339 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.693312 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ef3159_4fb3_4a8b_8264_e9ee14be3a04.slice/crio-edfd133f048d9bb1e03c520466775782dea98671935d8d31a22e6cee2d596700 WatchSource:0}: Error finding container edfd133f048d9bb1e03c520466775782dea98671935d8d31a22e6cee2d596700: Status 404 returned error can't find the container with id edfd133f048d9bb1e03c520466775782dea98671935d8d31a22e6cee2d596700 Apr 21 07:03:29.694144 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.694123 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8da13794_b67d_4df5_9370_57ce6358959a.slice/crio-6c2cf6b8235b68bcc6f3a148a2b00f30f3d0b3fffb6c51d08ad5793bc31fb331 WatchSource:0}: Error finding container 6c2cf6b8235b68bcc6f3a148a2b00f30f3d0b3fffb6c51d08ad5793bc31fb331: Status 404 returned error can't find the container with id 6c2cf6b8235b68bcc6f3a148a2b00f30f3d0b3fffb6c51d08ad5793bc31fb331 Apr 21 07:03:29.695606 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.695570 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb10b08b_2c33_4546_889b_697fd8825b2f.slice/crio-8721d59bffd350a38cc0742fc8b454983ba58b018e76916893a5920614e4a77d WatchSource:0}: Error finding container 8721d59bffd350a38cc0742fc8b454983ba58b018e76916893a5920614e4a77d: Status 404 returned error can't find the container with id 8721d59bffd350a38cc0742fc8b454983ba58b018e76916893a5920614e4a77d Apr 21 07:03:29.698211 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.697715 2573 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde46750f_df1b_4469_a3bd_4300d5fa0f79.slice/crio-6a30ba2e55695f24fb3a8effd3e70c843bef7db7d391c637f1d6e0d266cdd658 WatchSource:0}: Error finding container 6a30ba2e55695f24fb3a8effd3e70c843bef7db7d391c637f1d6e0d266cdd658: Status 404 returned error can't find the container with id 6a30ba2e55695f24fb3a8effd3e70c843bef7db7d391c637f1d6e0d266cdd658 Apr 21 07:03:29.699284 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.699221 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5affaa81_79dd_4de7_85b9_98182a2406f0.slice/crio-bee245fb09f72e633c866fc7b6873f5e31e3331104677f6a67f0da8a37fb8ddd WatchSource:0}: Error finding container bee245fb09f72e633c866fc7b6873f5e31e3331104677f6a67f0da8a37fb8ddd: Status 404 returned error can't find the container with id bee245fb09f72e633c866fc7b6873f5e31e3331104677f6a67f0da8a37fb8ddd Apr 21 07:03:29.699925 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:03:29.699904 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44af391c_8f7a_471b_a4eb_25f3b5519c86.slice/crio-f3f57b53748e4d6e204b692079a0446ff264ba01c4af32dfc6de9370a966c19f WatchSource:0}: Error finding container f3f57b53748e4d6e204b692079a0446ff264ba01c4af32dfc6de9370a966c19f: Status 404 returned error can't find the container with id f3f57b53748e4d6e204b692079a0446ff264ba01c4af32dfc6de9370a966c19f Apr 21 07:03:30.158216 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.158042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:30.158425 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.158234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:30.158425 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.158184 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:30.158425 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.158336 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls podName:71f6c66b-8c36-497f-9098-f070725c4d1d nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.158320711 +0000 UTC m=+37.190371917 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6rh5r" (UID: "71f6c66b-8c36-497f-9098-f070725c4d1d") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:30.158425 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.158358 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:03:30.158425 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.158373 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d6f45d765-cf7kk: secret "image-registry-tls" not found Apr 21 07:03:30.158425 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.158411 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls podName:e2bfa921-c09b-4485-a7a3-a08eebc1ceba nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.158399375 +0000 UTC m=+37.190450586 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls") pod "image-registry-5d6f45d765-cf7kk" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba") : secret "image-registry-tls" not found Apr 21 07:03:30.259274 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.259198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:30.259421 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.259279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:30.259421 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259307 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:03:30.259421 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.259316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:30.259421 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259365 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls podName:ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.259345464 +0000 UTC m=+37.291396696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z97m" (UID: "ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2") : secret "samples-operator-tls" not found Apr 21 07:03:30.259661 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259424 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.259413894 +0000 UTC m=+37.291465100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : configmap references non-existent config key: service-ca.crt Apr 21 07:03:30.259661 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259469 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:03:30.259661 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.259487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:30.259661 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259524 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert podName:fd0f8ab2-d283-4079-8826-80cb40b62cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.259494294 +0000 UTC m=+37.291545506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-d8wdr" (UID: "fd0f8ab2-d283-4079-8826-80cb40b62cab") : secret "networking-console-plugin-cert" not found Apr 21 07:03:30.259661 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259571 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:03:30.259661 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.259605 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.259595458 +0000 UTC m=+37.291646664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : secret "router-metrics-certs-default" not found Apr 21 07:03:30.360245 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.360210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:30.360975 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.360260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:30.360975 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.360387 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:03:30.360975 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.360438 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:03:30.360975 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.360450 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls podName:4519c586-5721-4cb3-bffc-7f4b13237ef7 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.360430361 +0000 UTC m=+37.392481572 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls") pod "dns-default-wknz9" (UID: "4519c586-5721-4cb3-bffc-7f4b13237ef7") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:30.360975 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:30.360488 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert podName:0816ede2-8af6-41c9-b423-5c313bc38315 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:32.360472162 +0000 UTC m=+37.392523376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert") pod "ingress-canary-vrfxd" (UID: "0816ede2-8af6-41c9-b423-5c313bc38315") : secret "canary-serving-cert" not found
Apr 21 07:03:30.698582 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.698542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" event={"ID":"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875","Type":"ContainerStarted","Data":"89d6372b35af7a47b9d011bbb40c05531b4511eeeff894af0173549caa828e41"}
Apr 21 07:03:30.702502 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.702440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" event={"ID":"de46750f-df1b-4469-a3bd-4300d5fa0f79","Type":"ContainerStarted","Data":"6a30ba2e55695f24fb3a8effd3e70c843bef7db7d391c637f1d6e0d266cdd658"}
Apr 21 07:03:30.706332 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.706268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w9nvv" event={"ID":"44af391c-8f7a-471b-a4eb-25f3b5519c86","Type":"ContainerStarted","Data":"f3f57b53748e4d6e204b692079a0446ff264ba01c4af32dfc6de9370a966c19f"}
Apr 21 07:03:30.711308 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.711280 2573 generic.go:358] "Generic (PLEG): container finished" podID="8ce849fd-7b86-4acc-b03c-5583cbf4cc68" containerID="cd8e7ad4f63fbfbad8adb8f5a244126db5b5064d6c4275e5a21f325b9c430ea2" exitCode=0
Apr 21 07:03:30.711430 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.711351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerDied","Data":"cd8e7ad4f63fbfbad8adb8f5a244126db5b5064d6c4275e5a21f325b9c430ea2"}
Apr 21 07:03:30.723199 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.723125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qv6dn" event={"ID":"5affaa81-79dd-4de7-85b9-98182a2406f0","Type":"ContainerStarted","Data":"bee245fb09f72e633c866fc7b6873f5e31e3331104677f6a67f0da8a37fb8ddd"}
Apr 21 07:03:30.725594 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.725540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" event={"ID":"22ef3159-4fb3-4a8b-8264-e9ee14be3a04","Type":"ContainerStarted","Data":"edfd133f048d9bb1e03c520466775782dea98671935d8d31a22e6cee2d596700"}
Apr 21 07:03:30.727148 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.727103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb" event={"ID":"8da13794-b67d-4df5-9370-57ce6358959a","Type":"ContainerStarted","Data":"6c2cf6b8235b68bcc6f3a148a2b00f30f3d0b3fffb6c51d08ad5793bc31fb331"}
Apr 21 07:03:30.729096 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:30.729055 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" event={"ID":"bb10b08b-2c33-4546-889b-697fd8825b2f","Type":"ContainerStarted","Data":"8721d59bffd350a38cc0742fc8b454983ba58b018e76916893a5920614e4a77d"}
Apr 21 07:03:31.750677 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:31.749712 2573 generic.go:358] "Generic (PLEG): container finished" podID="8ce849fd-7b86-4acc-b03c-5583cbf4cc68" containerID="5f04e98513a22b73cfc8a8cc98fb44aaa2de445b45e57dfb185d974d711574fd" exitCode=0
Apr 21 07:03:31.750677 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:31.749781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerDied","Data":"5f04e98513a22b73cfc8a8cc98fb44aaa2de445b45e57dfb185d974d711574fd"}
Apr 21 07:03:32.180601 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.180502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:32.180796 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.180710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"
Apr 21 07:03:32.180878 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.180799 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 07:03:32.180878 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.180865 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls podName:71f6c66b-8c36-497f-9098-f070725c4d1d nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.180846746 +0000 UTC m=+41.212897954 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6rh5r" (UID: "71f6c66b-8c36-497f-9098-f070725c4d1d") : secret "cluster-monitoring-operator-tls" not found
Apr 21 07:03:32.180992 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.180801 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 07:03:32.180992 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.180890 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d6f45d765-cf7kk: secret "image-registry-tls" not found
Apr 21 07:03:32.180992 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.180919 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls podName:e2bfa921-c09b-4485-a7a3-a08eebc1ceba nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.180909975 +0000 UTC m=+41.212961184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls") pod "image-registry-5d6f45d765-cf7kk" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba") : secret "image-registry-tls" not found
Apr 21 07:03:32.281820 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.281785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95"
Apr 21 07:03:32.281969 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.281916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95"
Apr 21 07:03:32.281969 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.281958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m"
Apr 21 07:03:32.282082 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.282016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr"
Apr 21 07:03:32.282149 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282134 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 07:03:32.282208 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282198 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert podName:fd0f8ab2-d283-4079-8826-80cb40b62cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.282180429 +0000 UTC m=+41.314231640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-d8wdr" (UID: "fd0f8ab2-d283-4079-8826-80cb40b62cab") : secret "networking-console-plugin-cert" not found
Apr 21 07:03:32.282472 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282448 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 07:03:32.282602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282481 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 07:03:32.282602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282523 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.28249123 +0000 UTC m=+41.314542453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : secret "router-metrics-certs-default" not found
Apr 21 07:03:32.282602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282565 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls podName:ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.282546349 +0000 UTC m=+41.314597562 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z97m" (UID: "ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2") : secret "samples-operator-tls" not found
Apr 21 07:03:32.282602 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.282584 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.282572485 +0000 UTC m=+41.314623695 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : configmap references non-existent config key: service-ca.crt
Apr 21 07:03:32.382935 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.382899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9"
Apr 21 07:03:32.383107 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.382958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd"
Apr 21 07:03:32.383107 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.383065 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:03:32.383286 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.383135 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls podName:4519c586-5721-4cb3-bffc-7f4b13237ef7 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.38311308 +0000 UTC m=+41.415164306 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls") pod "dns-default-wknz9" (UID: "4519c586-5721-4cb3-bffc-7f4b13237ef7") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:32.383286 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.383179 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:03:32.383286 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:32.383239 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert podName:0816ede2-8af6-41c9-b423-5c313bc38315 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.383223626 +0000 UTC m=+41.415274837 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert") pod "ingress-canary-vrfxd" (UID: "0816ede2-8af6-41c9-b423-5c313bc38315") : secret "canary-serving-cert" not found
Apr 21 07:03:32.987752 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.987705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:32.992375 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:32.992317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/240ddbe6-7b0f-4f03-9c28-38b3756ea88b-original-pull-secret\") pod \"global-pull-secret-syncer-mqxbs\" (UID: \"240ddbe6-7b0f-4f03-9c28-38b3756ea88b\") " pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:33.067613 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:33.067579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqxbs"
Apr 21 07:03:36.217027 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.216985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"
Apr 21 07:03:36.217529 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.217049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk"
Apr 21 07:03:36.217529 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.217104 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 07:03:36.217529 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.217183 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls podName:71f6c66b-8c36-497f-9098-f070725c4d1d nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.217160086 +0000 UTC m=+49.249211292 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6rh5r" (UID: "71f6c66b-8c36-497f-9098-f070725c4d1d") : secret "cluster-monitoring-operator-tls" not found
Apr 21 07:03:36.217529 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.217227 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 07:03:36.217529 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.217250 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d6f45d765-cf7kk: secret "image-registry-tls" not found
Apr 21 07:03:36.217529 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.217313 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls podName:e2bfa921-c09b-4485-a7a3-a08eebc1ceba nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.217299732 +0000 UTC m=+49.249350937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls") pod "image-registry-5d6f45d765-cf7kk" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba") : secret "image-registry-tls" not found
Apr 21 07:03:36.317802 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.317766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95"
Apr 21 07:03:36.317959 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.317816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m"
Apr 21 07:03:36.317959 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.317918 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 07:03:36.317959 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.317926 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 07:03:36.318087 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.317976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls podName:ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.317958292 +0000 UTC m=+49.350009527 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z97m" (UID: "ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2") : secret "samples-operator-tls" not found
Apr 21 07:03:36.318087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.318012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr"
Apr 21 07:03:36.318087 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.318062 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.318039694 +0000 UTC m=+49.350090917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : secret "router-metrics-certs-default" not found
Apr 21 07:03:36.318087 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.318081 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 07:03:36.318278 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.318108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95"
Apr 21 07:03:36.318278 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.318114 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert podName:fd0f8ab2-d283-4079-8826-80cb40b62cab nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.318104145 +0000 UTC m=+49.350155352 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-d8wdr" (UID: "fd0f8ab2-d283-4079-8826-80cb40b62cab") : secret "networking-console-plugin-cert" not found
Apr 21 07:03:36.318278 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.318192 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.318181774 +0000 UTC m=+49.350232985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : configmap references non-existent config key: service-ca.crt
Apr 21 07:03:36.419348 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.419302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9"
Apr 21 07:03:36.419550 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.419363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd"
Apr 21 07:03:36.419550 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.419487 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:03:36.419550 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.419547 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:03:36.419728 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.419602 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert podName:0816ede2-8af6-41c9-b423-5c313bc38315 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.419585239 +0000 UTC m=+49.451636448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert") pod "ingress-canary-vrfxd" (UID: "0816ede2-8af6-41c9-b423-5c313bc38315") : secret "canary-serving-cert" not found
Apr 21 07:03:36.419728 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:36.419625 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls podName:4519c586-5721-4cb3-bffc-7f4b13237ef7 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.4196139 +0000 UTC m=+49.451665111 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls") pod "dns-default-wknz9" (UID: "4519c586-5721-4cb3-bffc-7f4b13237ef7") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:36.969776 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:36.969749 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mqxbs"]
Apr 21 07:03:37.771529 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.766142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb" event={"ID":"8da13794-b67d-4df5-9370-57ce6358959a","Type":"ContainerStarted","Data":"b8c4bebade65861d48f125c5a88d73c9046de93f8bff160d4aff1c32a80640ed"}
Apr 21 07:03:37.771529 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.769972 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" event={"ID":"bb10b08b-2c33-4546-889b-697fd8825b2f","Type":"ContainerStarted","Data":"f37f395b8c78b974dd1e671db1273777839c0781784aef2f15d19e4bce552fe7"}
Apr 21 07:03:37.772030 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.771862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" event={"ID":"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875","Type":"ContainerStarted","Data":"bb27e7daa5bd2c9687efc400b31c1d25ab18210a352f6a407a0b4efdbb9574c8"}
Apr 21 07:03:37.775393 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.775366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" event={"ID":"de46750f-df1b-4469-a3bd-4300d5fa0f79","Type":"ContainerStarted","Data":"f2550efa57ad6c11089fa941e74b157e7271fe58623c4c65a99551a7087cd2b1"}
Apr 21 07:03:37.777766 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.777743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w9nvv" event={"ID":"44af391c-8f7a-471b-a4eb-25f3b5519c86","Type":"ContainerStarted","Data":"925dd8392451f213aefb321c1697c337c47fa8198a137922b6dd4da8ce6462ec"}
Apr 21 07:03:37.784598 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.784572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" event={"ID":"8ce849fd-7b86-4acc-b03c-5583cbf4cc68","Type":"ContainerStarted","Data":"3217aea743940906484c3919cf0b9be596eb47acca24e433d99433b9479e3a7e"}
Apr 21 07:03:37.787082 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.787060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qv6dn" event={"ID":"5affaa81-79dd-4de7-85b9-98182a2406f0","Type":"ContainerStarted","Data":"16f8c25c012674d2cc456120e9a9edd6c41d0e68c75568d746816ddf57b944a9"}
Apr 21 07:03:37.787456 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.787441 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qv6dn"
Apr 21 07:03:37.790598 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.790580 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/0.log"
Apr 21 07:03:37.790698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.790614 2573 generic.go:358] "Generic (PLEG): container finished" podID="22ef3159-4fb3-4a8b-8264-e9ee14be3a04" containerID="bec6b2597564aeefc5d8a1fb3c40c59fbc153b5428e02d2ae2456b17d415cb9d" exitCode=255
Apr 21 07:03:37.790698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.790670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" event={"ID":"22ef3159-4fb3-4a8b-8264-e9ee14be3a04","Type":"ContainerDied","Data":"bec6b2597564aeefc5d8a1fb3c40c59fbc153b5428e02d2ae2456b17d415cb9d"}
Apr 21 07:03:37.791634 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.791617 2573 scope.go:117] "RemoveContainer" containerID="bec6b2597564aeefc5d8a1fb3c40c59fbc153b5428e02d2ae2456b17d415cb9d"
Apr 21 07:03:37.793958 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.793928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mqxbs" event={"ID":"240ddbe6-7b0f-4f03-9c28-38b3756ea88b","Type":"ContainerStarted","Data":"7f0d9730f7d6ea03e4d60c4fb97791436da5152360a4239c217e53cc6ca4acd0"}
Apr 21 07:03:37.804473 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.803811 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-w9nvv" podStartSLOduration=32.704283114 podStartE2EDuration="39.80379703s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.717190938 +0000 UTC m=+34.749242143" lastFinishedPulling="2026-04-21 07:03:36.816704842 +0000 UTC m=+41.848756059" observedRunningTime="2026-04-21 07:03:37.803247782 +0000 UTC m=+42.835299015" watchObservedRunningTime="2026-04-21 07:03:37.80379703 +0000 UTC m=+42.835848257"
Apr 21 07:03:37.812640 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.812596 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9kqbb" podStartSLOduration=32.692797186 podStartE2EDuration="39.812583064s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.696789336 +0000 UTC m=+34.728840549" lastFinishedPulling="2026-04-21 07:03:36.816575207 +0000 UTC m=+41.848626427" observedRunningTime="2026-04-21 07:03:37.78371029 +0000 UTC m=+42.815761521" watchObservedRunningTime="2026-04-21 07:03:37.812583064 +0000 UTC m=+42.844634295"
Apr 21 07:03:37.826127 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.825590 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" podStartSLOduration=32.689172571 podStartE2EDuration="39.825574421s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.695803071 +0000 UTC m=+34.727854292" lastFinishedPulling="2026-04-21 07:03:36.832204933 +0000 UTC m=+41.864256142" observedRunningTime="2026-04-21 07:03:37.823948852 +0000 UTC m=+42.856000085" watchObservedRunningTime="2026-04-21 07:03:37.825574421 +0000 UTC m=+42.857625651"
Apr 21 07:03:37.841080 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.841034 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" podStartSLOduration=32.741402778 podStartE2EDuration="39.841017926s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.716951963 +0000 UTC m=+34.749003184" lastFinishedPulling="2026-04-21 07:03:36.816567126 +0000 UTC m=+41.848618332" observedRunningTime="2026-04-21 07:03:37.840237735 +0000 UTC m=+42.872288964" watchObservedRunningTime="2026-04-21 07:03:37.841017926 +0000 UTC m=+42.873069156"
Apr 21 07:03:37.860917 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.860868 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qpjps" podStartSLOduration=32.712692637 podStartE2EDuration="39.860848695s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.697490059 +0000 UTC m=+34.729541264" lastFinishedPulling="2026-04-21 07:03:36.845646115 +0000 UTC m=+41.877697322" observedRunningTime="2026-04-21 07:03:37.859817725 +0000 UTC m=+42.891868950" watchObservedRunningTime="2026-04-21 07:03:37.860848695 +0000 UTC m=+42.892899924"
Apr 21 07:03:37.890534 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.890453 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rlt5l" podStartSLOduration=11.454835783 podStartE2EDuration="42.890434747s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:02:58.305173372 +0000 UTC m=+3.337224596" lastFinishedPulling="2026-04-21 07:03:29.740772354 +0000 UTC m=+34.772823560" observedRunningTime="2026-04-21 07:03:37.887162333 +0000 UTC m=+42.919213562" watchObservedRunningTime="2026-04-21 07:03:37.890434747 +0000 UTC m=+42.922485978"
Apr 21 07:03:37.905847 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:37.905709 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qv6dn" podStartSLOduration=35.788296205 podStartE2EDuration="42.905692251s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.717005691 +0000 UTC m=+34.749056896" lastFinishedPulling="2026-04-21 07:03:36.834401736 +0000 UTC m=+41.866452942" observedRunningTime="2026-04-21 07:03:37.904219631 +0000 UTC m=+42.936270861" watchObservedRunningTime="2026-04-21 07:03:37.905692251 +0000 UTC m=+42.937743477"
Apr 21 07:03:38.684642 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.684591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:38.684642 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.684626 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q"
Apr 21 07:03:38.802083 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.802053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log"
Apr 21 07:03:38.802756 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.802720 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/0.log"
Apr 21 07:03:38.802890 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.802758 2573 generic.go:358] "Generic (PLEG): container finished" podID="22ef3159-4fb3-4a8b-8264-e9ee14be3a04" containerID="41e3e559d47463ef20484fcdea6d3f181f61502e89f549fd15230bf80c65ee5f" exitCode=255
Apr 21 07:03:38.803464 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.803266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" event={"ID":"22ef3159-4fb3-4a8b-8264-e9ee14be3a04","Type":"ContainerDied","Data":"41e3e559d47463ef20484fcdea6d3f181f61502e89f549fd15230bf80c65ee5f"}
Apr 21 07:03:38.803464 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.803336 2573 scope.go:117] "RemoveContainer" containerID="bec6b2597564aeefc5d8a1fb3c40c59fbc153b5428e02d2ae2456b17d415cb9d"
Apr 21 07:03:38.804337 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:38.804080 2573 scope.go:117] "RemoveContainer" containerID="41e3e559d47463ef20484fcdea6d3f181f61502e89f549fd15230bf80c65ee5f"
Apr 21 07:03:38.804337 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:38.804253 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4gb9q_openshift-console-operator(22ef3159-4fb3-4a8b-8264-e9ee14be3a04)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" podUID="22ef3159-4fb3-4a8b-8264-e9ee14be3a04"
Apr 21 07:03:39.807040 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:39.807006 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log"
Apr 21 07:03:39.807487 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:39.807430 2573 scope.go:117] "RemoveContainer" containerID="41e3e559d47463ef20484fcdea6d3f181f61502e89f549fd15230bf80c65ee5f"
Apr 21 07:03:39.807656 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:39.807632 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4gb9q_openshift-console-operator(22ef3159-4fb3-4a8b-8264-e9ee14be3a04)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" podUID="22ef3159-4fb3-4a8b-8264-e9ee14be3a04"
Apr 21 07:03:41.080127 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:41.080100 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q7rfs_2f25cb44-1f59-45ee-8bd4-d80ef4c1366b/dns-node-resolver/0.log"
Apr 21 07:03:41.813362 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:41.813329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mqxbs"
event={"ID":"240ddbe6-7b0f-4f03-9c28-38b3756ea88b","Type":"ContainerStarted","Data":"33ed4b13cfd44d66648f509f529fab0267128bb008c40fcaefcb5ae2536cb365"} Apr 21 07:03:41.833942 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:41.833898 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mqxbs" podStartSLOduration=36.414539094 podStartE2EDuration="40.833884898s" podCreationTimestamp="2026-04-21 07:03:01 +0000 UTC" firstStartedPulling="2026-04-21 07:03:36.976599681 +0000 UTC m=+42.008650894" lastFinishedPulling="2026-04-21 07:03:41.395945492 +0000 UTC m=+46.427996698" observedRunningTime="2026-04-21 07:03:41.833525574 +0000 UTC m=+46.865576791" watchObservedRunningTime="2026-04-21 07:03:41.833884898 +0000 UTC m=+46.865936127" Apr 21 07:03:42.080446 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:42.080369 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zzsbr_85559148-4ea4-4bfd-8bf0-55be583da361/node-ca/0.log" Apr 21 07:03:44.295747 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.295708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:03:44.295747 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.295753 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:03:44.296261 ip-10-0-143-69 kubenswrapper[2573]: 
E0421 07:03:44.295885 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:44.296261 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.295922 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:03:44.296261 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.295933 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d6f45d765-cf7kk: secret "image-registry-tls" not found Apr 21 07:03:44.296261 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.295971 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls podName:71f6c66b-8c36-497f-9098-f070725c4d1d nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.295949626 +0000 UTC m=+65.328000844 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6rh5r" (UID: "71f6c66b-8c36-497f-9098-f070725c4d1d") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:03:44.296261 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.295993 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls podName:e2bfa921-c09b-4485-a7a3-a08eebc1ceba nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.295981542 +0000 UTC m=+65.328032755 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls") pod "image-registry-5d6f45d765-cf7kk" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba") : secret "image-registry-tls" not found Apr 21 07:03:44.396216 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.396187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:44.396401 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.396235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:03:44.396401 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396323 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:03:44.396401 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396343 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:03:44.396401 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396380 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.3963649 +0000 UTC m=+65.428416110 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : secret "router-metrics-certs-default" not found Apr 21 07:03:44.396401 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396397 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls podName:ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.396389724 +0000 UTC m=+65.428440930 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z97m" (UID: "ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2") : secret "samples-operator-tls" not found Apr 21 07:03:44.396683 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.396414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:03:44.396683 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.396456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:03:44.396683 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396565 2573 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle podName:b6abfd8a-5d8b-4af8-94a1-95cf455336e0 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.396558199 +0000 UTC m=+65.428609404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle") pod "router-default-57f7f9fd66-mtt95" (UID: "b6abfd8a-5d8b-4af8-94a1-95cf455336e0") : configmap references non-existent config key: service-ca.crt Apr 21 07:03:44.396683 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396592 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:03:44.396683 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.396641 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert podName:fd0f8ab2-d283-4079-8826-80cb40b62cab nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.396627856 +0000 UTC m=+65.428679081 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-d8wdr" (UID: "fd0f8ab2-d283-4079-8826-80cb40b62cab") : secret "networking-console-plugin-cert" not found Apr 21 07:03:44.497449 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.497416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:03:44.497449 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:44.497449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:03:44.497655 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.497574 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:03:44.497655 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.497620 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert podName:0816ede2-8af6-41c9-b423-5c313bc38315 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.497605775 +0000 UTC m=+65.529656981 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert") pod "ingress-canary-vrfxd" (UID: "0816ede2-8af6-41c9-b423-5c313bc38315") : secret "canary-serving-cert" not found Apr 21 07:03:44.497655 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.497574 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:03:44.497751 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:03:44.497714 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls podName:4519c586-5721-4cb3-bffc-7f4b13237ef7 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:00.497697573 +0000 UTC m=+65.529748782 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls") pod "dns-default-wknz9" (UID: "4519c586-5721-4cb3-bffc-7f4b13237ef7") : secret "dns-default-metrics-tls" not found Apr 21 07:03:48.684367 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.684338 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:48.684367 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.684369 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:48.684901 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.684722 2573 scope.go:117] "RemoveContainer" containerID="41e3e559d47463ef20484fcdea6d3f181f61502e89f549fd15230bf80c65ee5f" Apr 21 07:03:48.831522 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.831482 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log" Apr 21 07:03:48.831664 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.831576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" event={"ID":"22ef3159-4fb3-4a8b-8264-e9ee14be3a04","Type":"ContainerStarted","Data":"5b1e814663e5f7fefa99516ebb70d0880371d648a762571d359a40b0cf750c8a"} Apr 21 07:03:48.831862 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.831840 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:48.848291 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:48.848251 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" podStartSLOduration=43.727272908 podStartE2EDuration="50.848239551s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:03:29.695742484 +0000 UTC m=+34.727793706" lastFinishedPulling="2026-04-21 07:03:36.816709134 +0000 UTC m=+41.848760349" observedRunningTime="2026-04-21 07:03:48.84709243 +0000 UTC m=+53.879143658" watchObservedRunningTime="2026-04-21 07:03:48.848239551 +0000 UTC m=+53.880290779" Apr 21 07:03:49.189823 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:49.189796 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4gb9q" Apr 21 07:03:54.695083 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:03:54.695053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt7hd" Apr 21 07:04:00.324544 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.324487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:04:00.324924 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.324559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:00.327321 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.327298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71f6c66b-8c36-497f-9098-f070725c4d1d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6rh5r\" (UID: \"71f6c66b-8c36-497f-9098-f070725c4d1d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:04:00.327404 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.327306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"image-registry-5d6f45d765-cf7kk\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:00.425560 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.425525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: 
\"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:04:00.425711 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.425582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:04:00.425711 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.425617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:00.425711 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.425666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:00.426288 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.426261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-service-ca-bundle\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:00.427908 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.427882 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z97m\" (UID: \"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:04:00.428115 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.428094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6abfd8a-5d8b-4af8-94a1-95cf455336e0-metrics-certs\") pod \"router-default-57f7f9fd66-mtt95\" (UID: \"b6abfd8a-5d8b-4af8-94a1-95cf455336e0\") " pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:00.428155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.428094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fd0f8ab2-d283-4079-8826-80cb40b62cab-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-d8wdr\" (UID: \"fd0f8ab2-d283-4079-8826-80cb40b62cab\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:04:00.500659 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.500631 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wwtlv\"" Apr 21 07:04:00.508959 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.508942 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:00.526924 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.526812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:04:00.526924 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.526858 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:04:00.529294 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.529267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4519c586-5721-4cb3-bffc-7f4b13237ef7-metrics-tls\") pod \"dns-default-wknz9\" (UID: \"4519c586-5721-4cb3-bffc-7f4b13237ef7\") " pod="openshift-dns/dns-default-wknz9" Apr 21 07:04:00.529428 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.529336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0816ede2-8af6-41c9-b423-5c313bc38315-cert\") pod \"ingress-canary-vrfxd\" (UID: \"0816ede2-8af6-41c9-b423-5c313bc38315\") " pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:04:00.578839 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.578778 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-zvxnz\"" Apr 21 07:04:00.586785 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.586758 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" Apr 21 07:04:00.598452 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.598355 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-926tf\"" Apr 21 07:04:00.606204 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.606181 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" Apr 21 07:04:00.608368 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.608339 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gssqw\"" Apr 21 07:04:00.616321 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.616300 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:00.626792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.626768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d6f45d765-cf7kk"] Apr 21 07:04:00.631533 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:00.631443 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bfa921_c09b_4485_a7a3_a08eebc1ceba.slice/crio-64b6b6020a1150e2081116abdc19c4cde7941535e8c41184ec96b6d381580dc3 WatchSource:0}: Error finding container 64b6b6020a1150e2081116abdc19c4cde7941535e8c41184ec96b6d381580dc3: Status 404 returned error can't find the container with id 64b6b6020a1150e2081116abdc19c4cde7941535e8c41184ec96b6d381580dc3 Apr 21 07:04:00.642578 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.642379 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pftx5\"" Apr 21 
07:04:00.649937 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.649624 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" Apr 21 07:04:00.697395 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.697189 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-f8s28\"" Apr 21 07:04:00.706477 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.703751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vrfxd" Apr 21 07:04:00.706477 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.706067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8pbf\"" Apr 21 07:04:00.712626 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.712038 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wknz9" Apr 21 07:04:00.760810 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.760649 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r"] Apr 21 07:04:00.769315 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:00.769255 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f6c66b_8c36_497f_9098_f070725c4d1d.slice/crio-502f55a63c9dd85d034098664a73b0b6b3f9164f2348756e727b6ac447422e28 WatchSource:0}: Error finding container 502f55a63c9dd85d034098664a73b0b6b3f9164f2348756e727b6ac447422e28: Status 404 returned error can't find the container with id 502f55a63c9dd85d034098664a73b0b6b3f9164f2348756e727b6ac447422e28 Apr 21 07:04:00.807503 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.807433 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m"] Apr 21 07:04:00.833424 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.833388 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57f7f9fd66-mtt95"] Apr 21 07:04:00.838632 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:00.838600 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6abfd8a_5d8b_4af8_94a1_95cf455336e0.slice/crio-2161faec767139d43f98741a4c2f68ff2b6f22d04b5957818b8d11b0b1564298 WatchSource:0}: Error finding container 2161faec767139d43f98741a4c2f68ff2b6f22d04b5957818b8d11b0b1564298: Status 404 returned error can't find the container with id 2161faec767139d43f98741a4c2f68ff2b6f22d04b5957818b8d11b0b1564298 Apr 21 07:04:00.847176 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.847122 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr"] Apr 21 07:04:00.861340 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.861301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" event={"ID":"fd0f8ab2-d283-4079-8826-80cb40b62cab","Type":"ContainerStarted","Data":"fa73acce95a42f3e7d4bb1cbd192ed2b6dea8d9cc77a54289bfec243ef1e2e1b"} Apr 21 07:04:00.862687 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.862660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" event={"ID":"71f6c66b-8c36-497f-9098-f070725c4d1d","Type":"ContainerStarted","Data":"502f55a63c9dd85d034098664a73b0b6b3f9164f2348756e727b6ac447422e28"} Apr 21 07:04:00.864111 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.863791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" 
event={"ID":"b6abfd8a-5d8b-4af8-94a1-95cf455336e0","Type":"ContainerStarted","Data":"2161faec767139d43f98741a4c2f68ff2b6f22d04b5957818b8d11b0b1564298"} Apr 21 07:04:00.865668 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.865642 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" event={"ID":"e2bfa921-c09b-4485-a7a3-a08eebc1ceba","Type":"ContainerStarted","Data":"9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c"} Apr 21 07:04:00.865777 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.865676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" event={"ID":"e2bfa921-c09b-4485-a7a3-a08eebc1ceba","Type":"ContainerStarted","Data":"64b6b6020a1150e2081116abdc19c4cde7941535e8c41184ec96b6d381580dc3"} Apr 21 07:04:00.881765 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.881719 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:00.882294 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.882119 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vrfxd"] Apr 21 07:04:00.885084 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:00.885061 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0816ede2_8af6_41c9_b423_5c313bc38315.slice/crio-a434392c744bc0572becd60a81e1c3c5f326b281c6f9ef39d389cbbe41f5bd86 WatchSource:0}: Error finding container a434392c744bc0572becd60a81e1c3c5f326b281c6f9ef39d389cbbe41f5bd86: Status 404 returned error can't find the container with id a434392c744bc0572becd60a81e1c3c5f326b281c6f9ef39d389cbbe41f5bd86 Apr 21 07:04:00.903760 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.902389 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" podStartSLOduration=65.902373689 podStartE2EDuration="1m5.902373689s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:04:00.901900956 +0000 UTC m=+65.933952186" watchObservedRunningTime="2026-04-21 07:04:00.902373689 +0000 UTC m=+65.934424922" Apr 21 07:04:00.905285 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:00.905261 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wknz9"] Apr 21 07:04:00.908493 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:00.908472 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4519c586_5721_4cb3_bffc_7f4b13237ef7.slice/crio-8a54e36da8e5268ff83507cce6c9057b10ab86c4d2156a28aea2de5cc72b68b3 WatchSource:0}: Error finding container 8a54e36da8e5268ff83507cce6c9057b10ab86c4d2156a28aea2de5cc72b68b3: Status 404 returned error can't find the container with id 8a54e36da8e5268ff83507cce6c9057b10ab86c4d2156a28aea2de5cc72b68b3 Apr 21 07:04:01.337433 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.337396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:04:01.340031 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.340002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e4d3a3ee-1584-42b6-a403-4bb39d451cab-metrics-certs\") pod \"network-metrics-daemon-qzgxt\" (UID: \"e4d3a3ee-1584-42b6-a403-4bb39d451cab\") " pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 
07:04:01.585182 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.585131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5rqg2\"" Apr 21 07:04:01.593351 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.593272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qzgxt" Apr 21 07:04:01.771547 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.771493 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qzgxt"] Apr 21 07:04:01.775912 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:01.775884 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4d3a3ee_1584_42b6_a403_4bb39d451cab.slice/crio-172cb2a873ebbc800ea111a95ec7ecdaac3eaa1bf2aafb3eac6d91cffc11d243 WatchSource:0}: Error finding container 172cb2a873ebbc800ea111a95ec7ecdaac3eaa1bf2aafb3eac6d91cffc11d243: Status 404 returned error can't find the container with id 172cb2a873ebbc800ea111a95ec7ecdaac3eaa1bf2aafb3eac6d91cffc11d243 Apr 21 07:04:01.871903 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.871801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wknz9" event={"ID":"4519c586-5721-4cb3-bffc-7f4b13237ef7","Type":"ContainerStarted","Data":"8a54e36da8e5268ff83507cce6c9057b10ab86c4d2156a28aea2de5cc72b68b3"} Apr 21 07:04:01.875002 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.874968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" event={"ID":"b6abfd8a-5d8b-4af8-94a1-95cf455336e0","Type":"ContainerStarted","Data":"c317c935a8ec56aee03b4478ecb8c43e773f8ac446bd1c4203b761c5a1d3c846"} Apr 21 07:04:01.876988 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.876961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-qzgxt" event={"ID":"e4d3a3ee-1584-42b6-a403-4bb39d451cab","Type":"ContainerStarted","Data":"172cb2a873ebbc800ea111a95ec7ecdaac3eaa1bf2aafb3eac6d91cffc11d243"} Apr 21 07:04:01.879563 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.879527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vrfxd" event={"ID":"0816ede2-8af6-41c9-b423-5c313bc38315","Type":"ContainerStarted","Data":"a434392c744bc0572becd60a81e1c3c5f326b281c6f9ef39d389cbbe41f5bd86"} Apr 21 07:04:01.881837 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.881790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" event={"ID":"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2","Type":"ContainerStarted","Data":"94db6234c5be033ae047fe07a48c8812c79e3b648aa0e703b97957351838c1d4"} Apr 21 07:04:01.896217 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:01.894676 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" podStartSLOduration=63.894661078 podStartE2EDuration="1m3.894661078s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:04:01.894384656 +0000 UTC m=+66.926435885" watchObservedRunningTime="2026-04-21 07:04:01.894661078 +0000 UTC m=+66.926712308" Apr 21 07:04:02.561684 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.561641 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fjqc2"] Apr 21 07:04:02.586314 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.586282 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fjqc2"] Apr 21 07:04:02.586476 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.586454 2573 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.589152 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.588996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 07:04:02.589152 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.589009 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-l89lk\"" Apr 21 07:04:02.589152 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.589033 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 07:04:02.617176 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.617132 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:02.620083 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.620052 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:02.651823 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.651744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a82f45ca-4a3a-421a-9360-c03b95c5ce27-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.651823 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.651791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a82f45ca-4a3a-421a-9360-c03b95c5ce27-crio-socket\") pod \"insights-runtime-extractor-fjqc2\" (UID: 
\"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.652049 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.651857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a82f45ca-4a3a-421a-9360-c03b95c5ce27-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.652049 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.651937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qhfx\" (UniqueName: \"kubernetes.io/projected/a82f45ca-4a3a-421a-9360-c03b95c5ce27-kube-api-access-9qhfx\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.652049 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.652009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a82f45ca-4a3a-421a-9360-c03b95c5ce27-data-volume\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.752669 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.752630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a82f45ca-4a3a-421a-9360-c03b95c5ce27-data-volume\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.752906 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.752707 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a82f45ca-4a3a-421a-9360-c03b95c5ce27-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.752906 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.752749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a82f45ca-4a3a-421a-9360-c03b95c5ce27-crio-socket\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.752906 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.752804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a82f45ca-4a3a-421a-9360-c03b95c5ce27-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.752906 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.752883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qhfx\" (UniqueName: \"kubernetes.io/projected/a82f45ca-4a3a-421a-9360-c03b95c5ce27-kube-api-access-9qhfx\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.753159 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.753018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a82f45ca-4a3a-421a-9360-c03b95c5ce27-data-volume\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " 
pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.753159 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.753106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a82f45ca-4a3a-421a-9360-c03b95c5ce27-crio-socket\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.753689 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.753650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a82f45ca-4a3a-421a-9360-c03b95c5ce27-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.756017 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.755996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a82f45ca-4a3a-421a-9360-c03b95c5ce27-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.781723 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.781690 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qhfx\" (UniqueName: \"kubernetes.io/projected/a82f45ca-4a3a-421a-9360-c03b95c5ce27-kube-api-access-9qhfx\") pod \"insights-runtime-extractor-fjqc2\" (UID: \"a82f45ca-4a3a-421a-9360-c03b95c5ce27\") " pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:02.885625 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.884985 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 
07:04:02.886649 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.886621 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-57f7f9fd66-mtt95" Apr 21 07:04:02.899203 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:02.899178 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fjqc2" Apr 21 07:04:05.910264 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:05.910188 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fjqc2"] Apr 21 07:04:05.915106 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:05.915076 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda82f45ca_4a3a_421a_9360_c03b95c5ce27.slice/crio-1931fa1b50d57f36cbf34ae36e97821ad3910c7f2764044145c4c2e6076182ad WatchSource:0}: Error finding container 1931fa1b50d57f36cbf34ae36e97821ad3910c7f2764044145c4c2e6076182ad: Status 404 returned error can't find the container with id 1931fa1b50d57f36cbf34ae36e97821ad3910c7f2764044145c4c2e6076182ad Apr 21 07:04:06.487347 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.487309 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45"] Apr 21 07:04:06.506421 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.506384 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45"] Apr 21 07:04:06.506553 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.506424 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:06.508866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.508751 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 07:04:06.508866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.508767 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-gptnx\"" Apr 21 07:04:06.586246 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.586215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f8f070b4-00c8-456b-802c-4794f5d87b21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-87n45\" (UID: \"f8f070b4-00c8-456b-802c-4794f5d87b21\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:06.686592 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.686532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f8f070b4-00c8-456b-802c-4794f5d87b21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-87n45\" (UID: \"f8f070b4-00c8-456b-802c-4794f5d87b21\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:06.688923 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.688899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f8f070b4-00c8-456b-802c-4794f5d87b21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-87n45\" (UID: \"f8f070b4-00c8-456b-802c-4794f5d87b21\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:06.815755 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.815723 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:06.897415 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.897383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vrfxd" event={"ID":"0816ede2-8af6-41c9-b423-5c313bc38315","Type":"ContainerStarted","Data":"1eba033e4560f3b7ddfe40aeede54d37ff1d93f6d5b934cef64ef684668b1a5d"} Apr 21 07:04:06.900050 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.899544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" event={"ID":"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2","Type":"ContainerStarted","Data":"6475ed7caad676b6ab7347463d410bb223665a28aeb1398feb0189f2b9d3405b"} Apr 21 07:04:06.900050 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.899582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" event={"ID":"ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2","Type":"ContainerStarted","Data":"2facc8a7e86ca4d313e9666003f9734b871897b8d5d550f9f70c141e93979c1b"} Apr 21 07:04:06.903386 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.903362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" event={"ID":"71f6c66b-8c36-497f-9098-f070725c4d1d","Type":"ContainerStarted","Data":"e323d2004d7d57a0584ce1b946de2f05271316315aff262c97171a6f1303b71b"} Apr 21 07:04:06.904879 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.904856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wknz9" 
event={"ID":"4519c586-5721-4cb3-bffc-7f4b13237ef7","Type":"ContainerStarted","Data":"01d680814308187e734aaa194523f13f37cb1a014b8a8a970f927033aa5d43f9"} Apr 21 07:04:06.904981 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.904888 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wknz9" event={"ID":"4519c586-5721-4cb3-bffc-7f4b13237ef7","Type":"ContainerStarted","Data":"9d136a9f9a1a7bd643aab37f3cf04569a1526ccb47d2401120c5a64aef953de3"} Apr 21 07:04:06.904981 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.904954 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wknz9" Apr 21 07:04:06.906078 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.906054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjqc2" event={"ID":"a82f45ca-4a3a-421a-9360-c03b95c5ce27","Type":"ContainerStarted","Data":"0a28eaad6be97b8c987b8bee5d717df44aaa9dcee5fa40328739e1e9c8de3e8f"} Apr 21 07:04:06.906078 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.906081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjqc2" event={"ID":"a82f45ca-4a3a-421a-9360-c03b95c5ce27","Type":"ContainerStarted","Data":"1931fa1b50d57f36cbf34ae36e97821ad3910c7f2764044145c4c2e6076182ad"} Apr 21 07:04:06.907156 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.907134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" event={"ID":"fd0f8ab2-d283-4079-8826-80cb40b62cab","Type":"ContainerStarted","Data":"a338d9ec038f6161831815ca59bcb1eb7e75530b5b14e4e79638f8378d8ff044"} Apr 21 07:04:06.908449 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.908431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qzgxt" 
event={"ID":"e4d3a3ee-1584-42b6-a403-4bb39d451cab","Type":"ContainerStarted","Data":"8e915e6d702478f052b38c2bf6bd3a685d1cfbe85fdaa807fbb3659867bd2261"} Apr 21 07:04:06.908543 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.908452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qzgxt" event={"ID":"e4d3a3ee-1584-42b6-a403-4bb39d451cab","Type":"ContainerStarted","Data":"74746227f80598876354573f3027af09b08adc93e1a4080756a79308fff554cc"} Apr 21 07:04:06.914351 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.914312 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vrfxd" podStartSLOduration=34.015658435 podStartE2EDuration="38.914300774s" podCreationTimestamp="2026-04-21 07:03:28 +0000 UTC" firstStartedPulling="2026-04-21 07:04:00.88747576 +0000 UTC m=+65.919526966" lastFinishedPulling="2026-04-21 07:04:05.786118087 +0000 UTC m=+70.818169305" observedRunningTime="2026-04-21 07:04:06.91371806 +0000 UTC m=+71.945769288" watchObservedRunningTime="2026-04-21 07:04:06.914300774 +0000 UTC m=+71.946352002" Apr 21 07:04:06.934165 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.934122 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6rh5r" podStartSLOduration=63.922384825 podStartE2EDuration="1m8.934111038s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:04:00.775468881 +0000 UTC m=+65.807520087" lastFinishedPulling="2026-04-21 07:04:05.78719508 +0000 UTC m=+70.819246300" observedRunningTime="2026-04-21 07:04:06.933456568 +0000 UTC m=+71.965507796" watchObservedRunningTime="2026-04-21 07:04:06.934111038 +0000 UTC m=+71.966162343" Apr 21 07:04:06.949925 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.949795 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z97m" podStartSLOduration=64.06830752 podStartE2EDuration="1m8.949782161s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:04:00.905503391 +0000 UTC m=+65.937554599" lastFinishedPulling="2026-04-21 07:04:05.786978034 +0000 UTC m=+70.819029240" observedRunningTime="2026-04-21 07:04:06.948919914 +0000 UTC m=+71.980971141" watchObservedRunningTime="2026-04-21 07:04:06.949782161 +0000 UTC m=+71.981833390" Apr 21 07:04:06.964405 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.964370 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-d8wdr" podStartSLOduration=64.036325321 podStartE2EDuration="1m8.964359932s" podCreationTimestamp="2026-04-21 07:02:58 +0000 UTC" firstStartedPulling="2026-04-21 07:04:00.858938019 +0000 UTC m=+65.890989231" lastFinishedPulling="2026-04-21 07:04:05.786972623 +0000 UTC m=+70.819023842" observedRunningTime="2026-04-21 07:04:06.963936146 +0000 UTC m=+71.995987374" watchObservedRunningTime="2026-04-21 07:04:06.964359932 +0000 UTC m=+71.996411159" Apr 21 07:04:06.981983 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.981938 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wknz9" podStartSLOduration=34.105178103 podStartE2EDuration="38.981925346s" podCreationTimestamp="2026-04-21 07:03:28 +0000 UTC" firstStartedPulling="2026-04-21 07:04:00.910634318 +0000 UTC m=+65.942685523" lastFinishedPulling="2026-04-21 07:04:05.78738156 +0000 UTC m=+70.819432766" observedRunningTime="2026-04-21 07:04:06.980807784 +0000 UTC m=+72.012859011" watchObservedRunningTime="2026-04-21 07:04:06.981925346 +0000 UTC m=+72.013976574" Apr 21 07:04:06.996874 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:06.996830 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-qzgxt" podStartSLOduration=67.63691568 podStartE2EDuration="1m11.996815455s" podCreationTimestamp="2026-04-21 07:02:55 +0000 UTC" firstStartedPulling="2026-04-21 07:04:01.779346937 +0000 UTC m=+66.811398157" lastFinishedPulling="2026-04-21 07:04:06.139246726 +0000 UTC m=+71.171297932" observedRunningTime="2026-04-21 07:04:06.995469111 +0000 UTC m=+72.027520340" watchObservedRunningTime="2026-04-21 07:04:06.996815455 +0000 UTC m=+72.028866685" Apr 21 07:04:07.088226 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:07.088202 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45"] Apr 21 07:04:07.090416 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:07.090385 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f070b4_00c8_456b_802c_4794f5d87b21.slice/crio-bd4d526c9e1efcca0ac6234f316a1cbe2459ca67e2a51dfaa7b38a20d54060d1 WatchSource:0}: Error finding container bd4d526c9e1efcca0ac6234f316a1cbe2459ca67e2a51dfaa7b38a20d54060d1: Status 404 returned error can't find the container with id bd4d526c9e1efcca0ac6234f316a1cbe2459ca67e2a51dfaa7b38a20d54060d1 Apr 21 07:04:07.913191 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:07.913154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjqc2" event={"ID":"a82f45ca-4a3a-421a-9360-c03b95c5ce27","Type":"ContainerStarted","Data":"cc85622ce1bb2b7b4421a6c023e972ad2fda05396f176f86d1e84726f50a1d2c"} Apr 21 07:04:07.914465 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:07.914393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" event={"ID":"f8f070b4-00c8-456b-802c-4794f5d87b21","Type":"ContainerStarted","Data":"bd4d526c9e1efcca0ac6234f316a1cbe2459ca67e2a51dfaa7b38a20d54060d1"} Apr 21 
07:04:09.809251 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.809215 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qv6dn" Apr 21 07:04:09.920648 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.920618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fjqc2" event={"ID":"a82f45ca-4a3a-421a-9360-c03b95c5ce27","Type":"ContainerStarted","Data":"4e111f5bac897a3a8f55217c6a94df1da4bca2703d763cbb0014137809a640bb"} Apr 21 07:04:09.921848 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.921822 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" event={"ID":"f8f070b4-00c8-456b-802c-4794f5d87b21","Type":"ContainerStarted","Data":"64dadcac1abbb68d6053753be0673ac3aab3f34687e6f30b30d2c4b146055ead"} Apr 21 07:04:09.922015 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.921999 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:09.926523 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.926486 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" Apr 21 07:04:09.954981 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.954932 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fjqc2" podStartSLOduration=5.059085341 podStartE2EDuration="7.954917482s" podCreationTimestamp="2026-04-21 07:04:02 +0000 UTC" firstStartedPulling="2026-04-21 07:04:06.222128843 +0000 UTC m=+71.254180049" lastFinishedPulling="2026-04-21 07:04:09.117960984 +0000 UTC m=+74.150012190" observedRunningTime="2026-04-21 07:04:09.954387455 +0000 UTC m=+74.986438683" 
watchObservedRunningTime="2026-04-21 07:04:09.954917482 +0000 UTC m=+74.986968709"
Apr 21 07:04:09.980166 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:09.980125 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-87n45" podStartSLOduration=1.957671955 podStartE2EDuration="3.980111435s" podCreationTimestamp="2026-04-21 07:04:06 +0000 UTC" firstStartedPulling="2026-04-21 07:04:07.092222627 +0000 UTC m=+72.124273833" lastFinishedPulling="2026-04-21 07:04:09.114662092 +0000 UTC m=+74.146713313" observedRunningTime="2026-04-21 07:04:09.979003636 +0000 UTC m=+75.011054857" watchObservedRunningTime="2026-04-21 07:04:09.980111435 +0000 UTC m=+75.012162663"
Apr 21 07:04:10.571634 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.571604 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-stjpk"]
Apr 21 07:04:10.574961 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.574946 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.578576 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.578556 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 07:04:10.578959 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.578941 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 07:04:10.578959 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.578957 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-kvgmf\""
Apr 21 07:04:10.579087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.579010 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 07:04:10.588709 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.588691 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-stjpk"]
Apr 21 07:04:10.728986 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.728956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2l6z\" (UniqueName: \"kubernetes.io/projected/b0249fec-358f-462d-9041-e00bf841cdd3-kube-api-access-q2l6z\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.729129 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.729024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0249fec-358f-462d-9041-e00bf841cdd3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.729129 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.729062 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0249fec-358f-462d-9041-e00bf841cdd3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.729129 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.729087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0249fec-358f-462d-9041-e00bf841cdd3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.829554 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.829476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0249fec-358f-462d-9041-e00bf841cdd3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.829554 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.829527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0249fec-358f-462d-9041-e00bf841cdd3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.829957 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.829576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2l6z\" (UniqueName: \"kubernetes.io/projected/b0249fec-358f-462d-9041-e00bf841cdd3-kube-api-access-q2l6z\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.829957 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.829638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0249fec-358f-462d-9041-e00bf841cdd3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.830174 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.830155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0249fec-358f-462d-9041-e00bf841cdd3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.831807 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.831791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0249fec-358f-462d-9041-e00bf841cdd3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.831902 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.831883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0249fec-358f-462d-9041-e00bf841cdd3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.837304 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.837282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2l6z\" (UniqueName: \"kubernetes.io/projected/b0249fec-358f-462d-9041-e00bf841cdd3-kube-api-access-q2l6z\") pod \"prometheus-operator-5676c8c784-stjpk\" (UID: \"b0249fec-358f-462d-9041-e00bf841cdd3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.884214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.884193 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk"
Apr 21 07:04:10.997456 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:10.997331 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-stjpk"]
Apr 21 07:04:11.000110 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:11.000083 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0249fec_358f_462d_9041_e00bf841cdd3.slice/crio-41792a1d3f8e6f1f8017b102a608ddaad623c5b619c4716f8ab89060f2206cfd WatchSource:0}: Error finding container 41792a1d3f8e6f1f8017b102a608ddaad623c5b619c4716f8ab89060f2206cfd: Status 404 returned error can't find the container with id 41792a1d3f8e6f1f8017b102a608ddaad623c5b619c4716f8ab89060f2206cfd
Apr 21 07:04:11.927837 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:11.927796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk" event={"ID":"b0249fec-358f-462d-9041-e00bf841cdd3","Type":"ContainerStarted","Data":"41792a1d3f8e6f1f8017b102a608ddaad623c5b619c4716f8ab89060f2206cfd"}
Apr 21 07:04:12.931641 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:12.931609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk" event={"ID":"b0249fec-358f-462d-9041-e00bf841cdd3","Type":"ContainerStarted","Data":"8e949478164f25c4efe083a0b2087a780028dbdfcfe27eb9d394fa0d078a29f1"}
Apr 21 07:04:12.931641 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:12.931643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk" event={"ID":"b0249fec-358f-462d-9041-e00bf841cdd3","Type":"ContainerStarted","Data":"8df3662e1e55d0cdaf4ea54f3a06e0790a5e7df2198229ff7345e89bcf288f9b"}
Apr 21 07:04:12.961917 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:12.961876 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-stjpk" podStartSLOduration=1.758494458 podStartE2EDuration="2.961862117s" podCreationTimestamp="2026-04-21 07:04:10 +0000 UTC" firstStartedPulling="2026-04-21 07:04:11.001971894 +0000 UTC m=+76.034023100" lastFinishedPulling="2026-04-21 07:04:12.205339549 +0000 UTC m=+77.237390759" observedRunningTime="2026-04-21 07:04:12.960690561 +0000 UTC m=+77.992741788" watchObservedRunningTime="2026-04-21 07:04:12.961862117 +0000 UTC m=+77.993913344"
Apr 21 07:04:14.972851 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:14.972725 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8sjbz"]
Apr 21 07:04:14.995275 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:14.995251 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:14.997568 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:14.997537 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 07:04:14.999987 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:14.999965 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 07:04:15.000186 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.000169 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 07:04:15.000249 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.000195 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-j5sr4\""
Apr 21 07:04:15.166184 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjqg\" (UniqueName: \"kubernetes.io/projected/84902026-6f14-435c-bdd6-a7e07f14ac16-kube-api-access-6tjqg\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166184 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-sys\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166360 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-wtmp\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166360 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166360 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-accelerators-collector-config\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166360 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84902026-6f14-435c-bdd6-a7e07f14ac16-metrics-client-ca\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166484 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-root\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166484 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-textfile\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.166484 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.166416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-tls\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267322 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267322 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-accelerators-collector-config\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84902026-6f14-435c-bdd6-a7e07f14ac16-metrics-client-ca\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-root\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-textfile\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-tls\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjqg\" (UniqueName: \"kubernetes.io/projected/84902026-6f14-435c-bdd6-a7e07f14ac16-kube-api-access-6tjqg\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-root\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-sys\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267521 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-wtmp\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267937 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-wtmp\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267937 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84902026-6f14-435c-bdd6-a7e07f14ac16-sys\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.267937 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.267901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-accelerators-collector-config\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.268091 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.268067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-textfile\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.268377 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.268352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84902026-6f14-435c-bdd6-a7e07f14ac16-metrics-client-ca\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.269971 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.269951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.270502 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.270482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84902026-6f14-435c-bdd6-a7e07f14ac16-node-exporter-tls\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.289350 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.289330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjqg\" (UniqueName: \"kubernetes.io/projected/84902026-6f14-435c-bdd6-a7e07f14ac16-kube-api-access-6tjqg\") pod \"node-exporter-8sjbz\" (UID: \"84902026-6f14-435c-bdd6-a7e07f14ac16\") " pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.305446 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.305426 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8sjbz"
Apr 21 07:04:15.315153 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:15.315130 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84902026_6f14_435c_bdd6_a7e07f14ac16.slice/crio-37b58fcb7ec120e66ce02e4061d22fe76d48f31636ebc96745c5e761a3ed3a1b WatchSource:0}: Error finding container 37b58fcb7ec120e66ce02e4061d22fe76d48f31636ebc96745c5e761a3ed3a1b: Status 404 returned error can't find the container with id 37b58fcb7ec120e66ce02e4061d22fe76d48f31636ebc96745c5e761a3ed3a1b
Apr 21 07:04:15.942314 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:15.942277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8sjbz" event={"ID":"84902026-6f14-435c-bdd6-a7e07f14ac16","Type":"ContainerStarted","Data":"37b58fcb7ec120e66ce02e4061d22fe76d48f31636ebc96745c5e761a3ed3a1b"}
Apr 21 07:04:16.917548 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:16.917520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wknz9"
Apr 21 07:04:16.946455 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:16.946426 2573 generic.go:358] "Generic (PLEG): container finished" podID="84902026-6f14-435c-bdd6-a7e07f14ac16" containerID="38de124d8f5054b886d08fd91f88c66b6dd5bec2cc471b28396350cd02fb78dd" exitCode=0
Apr 21 07:04:16.946612 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:16.946526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8sjbz" event={"ID":"84902026-6f14-435c-bdd6-a7e07f14ac16","Type":"ContainerDied","Data":"38de124d8f5054b886d08fd91f88c66b6dd5bec2cc471b28396350cd02fb78dd"}
Apr 21 07:04:17.950977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:17.950940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8sjbz" event={"ID":"84902026-6f14-435c-bdd6-a7e07f14ac16","Type":"ContainerStarted","Data":"0796874a9f68f4e6be431eab9864754eb21e8c0e5343fc611cea49cf65a638d4"}
Apr 21 07:04:17.950977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:17.950977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8sjbz" event={"ID":"84902026-6f14-435c-bdd6-a7e07f14ac16","Type":"ContainerStarted","Data":"859c13158089946ed7fff0206255e2f40651f36821995d18bcbaf1c629df6dc3"}
Apr 21 07:04:17.971823 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:17.971782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8sjbz" podStartSLOduration=2.973249439 podStartE2EDuration="3.971768668s" podCreationTimestamp="2026-04-21 07:04:14 +0000 UTC" firstStartedPulling="2026-04-21 07:04:15.317051681 +0000 UTC m=+80.349102893" lastFinishedPulling="2026-04-21 07:04:16.315570899 +0000 UTC m=+81.347622122" observedRunningTime="2026-04-21 07:04:17.969679924 +0000 UTC m=+83.001731151" watchObservedRunningTime="2026-04-21 07:04:17.971768668 +0000 UTC m=+83.003819895"
Apr 21 07:04:19.715341 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:19.715308 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"]
Apr 21 07:04:19.718550 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:19.718531 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"
Apr 21 07:04:19.721131 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:19.721108 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6s57p\""
Apr 21 07:04:19.721234 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:19.721166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 21 07:04:19.726760 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:19.726740 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"]
Apr 21 07:04:19.907932 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:19.907902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r4jlc\" (UID: \"1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"
Apr 21 07:04:20.008997 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.008971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r4jlc\" (UID: \"1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"
Apr 21 07:04:20.009149 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:04:20.009092 2573 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 21 07:04:20.009189 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:04:20.009163 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8-monitoring-plugin-cert podName:1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:20.509148121 +0000 UTC m=+85.541199331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-r4jlc" (UID: "1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8") : secret "monitoring-plugin-cert" not found
Apr 21 07:04:20.513731 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.513699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r4jlc\" (UID: \"1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"
Apr 21 07:04:20.516495 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.516459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r4jlc\" (UID: \"1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"
Apr 21 07:04:20.628579 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.628556 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"
Apr 21 07:04:20.747647 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.747618 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc"]
Apr 21 07:04:20.751119 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:20.751083 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe2d2b7_47fc_4a5d_a8b5_19fc771065e8.slice/crio-5a68353a7b0e5b43882c83aec394e2b87b8f06ce3304ce8293251a31c77aa320 WatchSource:0}: Error finding container 5a68353a7b0e5b43882c83aec394e2b87b8f06ce3304ce8293251a31c77aa320: Status 404 returned error can't find the container with id 5a68353a7b0e5b43882c83aec394e2b87b8f06ce3304ce8293251a31c77aa320
Apr 21 07:04:20.896178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.896115 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8555fcc9b-nts68"]
Apr 21 07:04:20.900529 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.900490 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:20.903042 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.903023 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 07:04:20.903136 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.903027 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 07:04:20.904136 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.904119 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 07:04:20.904252 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.904235 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 07:04:20.904307 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.904241 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 07:04:20.904307 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.904289 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 07:04:20.904307 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.904302 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2rctr\""
Apr 21 07:04:20.904500 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.904486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 07:04:20.910182 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.910158 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 07:04:20.914070 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.914050 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8555fcc9b-nts68"]
Apr 21 07:04:20.960438 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:20.960404 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc" event={"ID":"1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8","Type":"ContainerStarted","Data":"5a68353a7b0e5b43882c83aec394e2b87b8f06ce3304ce8293251a31c77aa320"}
Apr 21 07:04:21.018006 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.017967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-config\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.018006 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.018005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbkb8\" (UniqueName: \"kubernetes.io/projected/e287b8b8-650d-4318-ad39-c437f0b93ba3-kube-api-access-gbkb8\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.018188 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.018038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-serving-cert\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.018188 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.018127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-trusted-ca-bundle\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.018188 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.018179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-service-ca\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.018348 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.018195 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-oauth-serving-cert\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.018348 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.018220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-oauth-config\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119342 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-config\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119342 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbkb8\" (UniqueName: \"kubernetes.io/projected/e287b8b8-650d-4318-ad39-c437f0b93ba3-kube-api-access-gbkb8\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119602 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-serving-cert\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119602 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-trusted-ca-bundle\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119602 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-service-ca\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119602 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-oauth-serving-cert\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:04:21.119602
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.119529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-oauth-config\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.120312 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.120283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-service-ca\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.120429 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.120337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-trusted-ca-bundle\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.120496 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.120421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-oauth-serving-cert\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.120588 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.120571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-config\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 
07:04:21.122475 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.122450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-serving-cert\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.122589 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.122499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-oauth-config\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.127123 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.127096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbkb8\" (UniqueName: \"kubernetes.io/projected/e287b8b8-650d-4318-ad39-c437f0b93ba3-kube-api-access-gbkb8\") pod \"console-8555fcc9b-nts68\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.215640 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.215561 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:21.358490 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.358465 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8555fcc9b-nts68"] Apr 21 07:04:21.360981 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:04:21.360957 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode287b8b8_650d_4318_ad39_c437f0b93ba3.slice/crio-058ba7e281dc7e1c2eecc57e229315d44eac9173e9a3462d3522e69ae78eb905 WatchSource:0}: Error finding container 058ba7e281dc7e1c2eecc57e229315d44eac9173e9a3462d3522e69ae78eb905: Status 404 returned error can't find the container with id 058ba7e281dc7e1c2eecc57e229315d44eac9173e9a3462d3522e69ae78eb905 Apr 21 07:04:21.886927 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.886895 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:21.965547 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:21.965478 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8555fcc9b-nts68" event={"ID":"e287b8b8-650d-4318-ad39-c437f0b93ba3","Type":"ContainerStarted","Data":"058ba7e281dc7e1c2eecc57e229315d44eac9173e9a3462d3522e69ae78eb905"} Apr 21 07:04:22.970243 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:22.970193 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc" event={"ID":"1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8","Type":"ContainerStarted","Data":"ddd1cfa0195815e31d3193d2c26ebeaa94910fb7a35a3682a6bcd15a8b3c005b"} Apr 21 07:04:22.970698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:22.970393 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc" Apr 21 07:04:22.975957 ip-10-0-143-69 kubenswrapper[2573]: 
I0421 07:04:22.975930 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc" Apr 21 07:04:22.986490 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:22.986451 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r4jlc" podStartSLOduration=2.462805355 podStartE2EDuration="3.986441044s" podCreationTimestamp="2026-04-21 07:04:19 +0000 UTC" firstStartedPulling="2026-04-21 07:04:20.75309869 +0000 UTC m=+85.785149896" lastFinishedPulling="2026-04-21 07:04:22.276734369 +0000 UTC m=+87.308785585" observedRunningTime="2026-04-21 07:04:22.98530344 +0000 UTC m=+88.017354651" watchObservedRunningTime="2026-04-21 07:04:22.986441044 +0000 UTC m=+88.018492280" Apr 21 07:04:24.378423 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:24.378335 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d6f45d765-cf7kk"] Apr 21 07:04:24.976857 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:24.976818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8555fcc9b-nts68" event={"ID":"e287b8b8-650d-4318-ad39-c437f0b93ba3","Type":"ContainerStarted","Data":"0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c"} Apr 21 07:04:24.995423 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:24.995361 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8555fcc9b-nts68" podStartSLOduration=2.3079384 podStartE2EDuration="4.995350181s" podCreationTimestamp="2026-04-21 07:04:20 +0000 UTC" firstStartedPulling="2026-04-21 07:04:21.362966768 +0000 UTC m=+86.395017974" lastFinishedPulling="2026-04-21 07:04:24.050378539 +0000 UTC m=+89.082429755" observedRunningTime="2026-04-21 07:04:24.993799387 +0000 UTC m=+90.025850615" watchObservedRunningTime="2026-04-21 07:04:24.995350181 +0000 UTC m=+90.027401409" Apr 21 
07:04:31.216282 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:31.216250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:31.216757 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:31.216326 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:31.221377 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:31.221356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:32.005457 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:32.005430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:04:48.047013 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:48.046981 2573 generic.go:358] "Generic (PLEG): container finished" podID="44af391c-8f7a-471b-a4eb-25f3b5519c86" containerID="925dd8392451f213aefb321c1697c337c47fa8198a137922b6dd4da8ce6462ec" exitCode=0 Apr 21 07:04:48.047353 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:48.047056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w9nvv" event={"ID":"44af391c-8f7a-471b-a4eb-25f3b5519c86","Type":"ContainerDied","Data":"925dd8392451f213aefb321c1697c337c47fa8198a137922b6dd4da8ce6462ec"} Apr 21 07:04:48.047394 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:48.047382 2573 scope.go:117] "RemoveContainer" containerID="925dd8392451f213aefb321c1697c337c47fa8198a137922b6dd4da8ce6462ec" Apr 21 07:04:49.051456 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.051418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w9nvv" event={"ID":"44af391c-8f7a-471b-a4eb-25f3b5519c86","Type":"ContainerStarted","Data":"21e0e11df8f0346b4522121b8f32d10e33e77da0e5e29ea1bef8f3b2920dca59"} Apr 21 07:04:49.052654 
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.052624 2573 generic.go:358] "Generic (PLEG): container finished" podID="de46750f-df1b-4469-a3bd-4300d5fa0f79" containerID="f2550efa57ad6c11089fa941e74b157e7271fe58623c4c65a99551a7087cd2b1" exitCode=0 Apr 21 07:04:49.052758 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.052682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" event={"ID":"de46750f-df1b-4469-a3bd-4300d5fa0f79","Type":"ContainerDied","Data":"f2550efa57ad6c11089fa941e74b157e7271fe58623c4c65a99551a7087cd2b1"} Apr 21 07:04:49.052959 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.052945 2573 scope.go:117] "RemoveContainer" containerID="f2550efa57ad6c11089fa941e74b157e7271fe58623c4c65a99551a7087cd2b1" Apr 21 07:04:49.399527 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.399425 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" podUID="e2bfa921-c09b-4485-a7a3-a08eebc1ceba" containerName="registry" containerID="cri-o://9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c" gracePeriod=30 Apr 21 07:04:49.646728 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.646706 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:49.728402 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728365 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-bound-sa-token\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.728617 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728446 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-trusted-ca\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.728617 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728476 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-ca-trust-extracted\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.728617 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728563 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-image-registry-private-configuration\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.728617 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728594 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 
07:04:49.728839 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728630 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-installation-pull-secrets\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.728839 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728687 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjs9m\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-kube-api-access-gjs9m\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.728839 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.728720 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-certificates\") pod \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\" (UID: \"e2bfa921-c09b-4485-a7a3-a08eebc1ceba\") " Apr 21 07:04:49.729483 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.729337 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:04:49.729786 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.729738 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:04:49.732085 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.732044 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:04:49.732253 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.732220 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:04:49.732751 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.732709 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:04:49.734015 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.733986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:04:49.734231 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.734195 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-kube-api-access-gjs9m" (OuterVolumeSpecName: "kube-api-access-gjs9m") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "kube-api-access-gjs9m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:04:49.740578 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.740496 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2bfa921-c09b-4485-a7a3-a08eebc1ceba" (UID: "e2bfa921-c09b-4485-a7a3-a08eebc1ceba"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:04:49.829977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.829951 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-image-registry-private-configuration\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:49.829977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.829975 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-tls\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:49.830087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.829986 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-installation-pull-secrets\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 
07:04:49.830087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.829995 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjs9m\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-kube-api-access-gjs9m\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:49.830087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.830004 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-registry-certificates\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:49.830087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.830013 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-bound-sa-token\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:49.830087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.830021 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-trusted-ca\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:49.830087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:49.830029 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2bfa921-c09b-4485-a7a3-a08eebc1ceba-ca-trust-extracted\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:04:50.056921 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.056834 2573 generic.go:358] "Generic (PLEG): container finished" podID="e2bfa921-c09b-4485-a7a3-a08eebc1ceba" containerID="9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c" exitCode=0 Apr 21 07:04:50.056921 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.056886 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" Apr 21 07:04:50.057366 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.056927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" event={"ID":"e2bfa921-c09b-4485-a7a3-a08eebc1ceba","Type":"ContainerDied","Data":"9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c"} Apr 21 07:04:50.057366 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.056960 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d6f45d765-cf7kk" event={"ID":"e2bfa921-c09b-4485-a7a3-a08eebc1ceba","Type":"ContainerDied","Data":"64b6b6020a1150e2081116abdc19c4cde7941535e8c41184ec96b6d381580dc3"} Apr 21 07:04:50.057366 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.056978 2573 scope.go:117] "RemoveContainer" containerID="9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c" Apr 21 07:04:50.058724 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.058703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hq6q2" event={"ID":"de46750f-df1b-4469-a3bd-4300d5fa0f79","Type":"ContainerStarted","Data":"68a885ee0cb0f54d9a9c31605641cd4a68278f4868e7d25da95027d479c3d8f3"} Apr 21 07:04:50.065093 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.065078 2573 scope.go:117] "RemoveContainer" containerID="9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c" Apr 21 07:04:50.065362 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:04:50.065342 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c\": container with ID starting with 9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c not found: ID does not exist" 
containerID="9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c" Apr 21 07:04:50.065426 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.065374 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c"} err="failed to get container status \"9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c\": rpc error: code = NotFound desc = could not find container \"9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c\": container with ID starting with 9bcdda338eab3c43f083f2f3022a81fb5bc1902edd48f382793fefbe059c5e0c not found: ID does not exist" Apr 21 07:04:50.094468 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.094447 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d6f45d765-cf7kk"] Apr 21 07:04:50.103324 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:50.103302 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d6f45d765-cf7kk"] Apr 21 07:04:51.535100 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:51.535065 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bfa921-c09b-4485-a7a3-a08eebc1ceba" path="/var/lib/kubelet/pods/e2bfa921-c09b-4485-a7a3-a08eebc1ceba/volumes" Apr 21 07:04:53.074302 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:53.074270 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2986c84-0eaa-4d7a-a7c4-5337ab7f4875" containerID="bb27e7daa5bd2c9687efc400b31c1d25ab18210a352f6a407a0b4efdbb9574c8" exitCode=0 Apr 21 07:04:53.074671 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:53.074308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" 
event={"ID":"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875","Type":"ContainerDied","Data":"bb27e7daa5bd2c9687efc400b31c1d25ab18210a352f6a407a0b4efdbb9574c8"} Apr 21 07:04:53.074671 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:53.074614 2573 scope.go:117] "RemoveContainer" containerID="bb27e7daa5bd2c9687efc400b31c1d25ab18210a352f6a407a0b4efdbb9574c8" Apr 21 07:04:54.078524 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:04:54.078471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4pwc4" event={"ID":"c2986c84-0eaa-4d7a-a7c4-5337ab7f4875","Type":"ContainerStarted","Data":"1346a9acb27eb6c4ba25adc0acf276ec581bbd585dafe5658b6740d32ab8db74"} Apr 21 07:05:43.552450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.552412 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-97755bb4-v8dwm"] Apr 21 07:05:43.553048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.552842 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2bfa921-c09b-4485-a7a3-a08eebc1ceba" containerName="registry" Apr 21 07:05:43.553048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.552864 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bfa921-c09b-4485-a7a3-a08eebc1ceba" containerName="registry" Apr 21 07:05:43.553048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.552923 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2bfa921-c09b-4485-a7a3-a08eebc1ceba" containerName="registry" Apr 21 07:05:43.556198 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.556171 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.567445 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.567402 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97755bb4-v8dwm"] Apr 21 07:05:43.732357 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpwpr\" (UniqueName: \"kubernetes.io/projected/b1469328-39d0-4277-857c-203b03aa0a7b-kube-api-access-bpwpr\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.732544 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-trusted-ca-bundle\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.732544 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-console-config\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.732544 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-oauth-serving-cert\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.732544 
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-service-ca\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.732687 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-serving-cert\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.732687 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.732594 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-oauth-config\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833569 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-service-ca\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833569 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-serving-cert\") pod \"console-97755bb4-v8dwm\" (UID: 
\"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833752 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-oauth-config\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833752 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpwpr\" (UniqueName: \"kubernetes.io/projected/b1469328-39d0-4277-857c-203b03aa0a7b-kube-api-access-bpwpr\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833849 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-trusted-ca-bundle\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833849 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-console-config\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.833948 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.833862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-oauth-serving-cert\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.834473 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.834452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-oauth-serving-cert\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.834607 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.834482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-console-config\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.834674 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.834659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-trusted-ca-bundle\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.834862 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.834843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-service-ca\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.836076 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.836052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-serving-cert\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.836161 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.836100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-oauth-config\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.841367 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.841347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpwpr\" (UniqueName: \"kubernetes.io/projected/b1469328-39d0-4277-857c-203b03aa0a7b-kube-api-access-bpwpr\") pod \"console-97755bb4-v8dwm\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") " pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.868387 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.868356 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:43.983215 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:43.983186 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97755bb4-v8dwm"] Apr 21 07:05:43.986637 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:05:43.986613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1469328_39d0_4277_857c_203b03aa0a7b.slice/crio-4b1bce2bb29163156705c23ac5b1c05cc4e0ce76889fc0221a8ac05960a7f78f WatchSource:0}: Error finding container 4b1bce2bb29163156705c23ac5b1c05cc4e0ce76889fc0221a8ac05960a7f78f: Status 404 returned error can't find the container with id 4b1bce2bb29163156705c23ac5b1c05cc4e0ce76889fc0221a8ac05960a7f78f Apr 21 07:05:44.225723 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:44.225690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97755bb4-v8dwm" event={"ID":"b1469328-39d0-4277-857c-203b03aa0a7b","Type":"ContainerStarted","Data":"b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b"} Apr 21 07:05:44.225723 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:44.225728 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97755bb4-v8dwm" event={"ID":"b1469328-39d0-4277-857c-203b03aa0a7b","Type":"ContainerStarted","Data":"4b1bce2bb29163156705c23ac5b1c05cc4e0ce76889fc0221a8ac05960a7f78f"} Apr 21 07:05:44.243777 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:44.243700 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-97755bb4-v8dwm" podStartSLOduration=1.243684195 podStartE2EDuration="1.243684195s" podCreationTimestamp="2026-04-21 07:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:05:44.242197656 +0000 UTC m=+169.274248884" 
watchObservedRunningTime="2026-04-21 07:05:44.243684195 +0000 UTC m=+169.275735423" Apr 21 07:05:53.869446 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:53.869390 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:53.869446 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:53.869449 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:53.874977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:53.874955 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:54.257502 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.257473 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-97755bb4-v8dwm" Apr 21 07:05:54.305882 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.305848 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8555fcc9b-nts68"] Apr 21 07:05:54.437464 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.437432 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65f995cd7c-tprdx"] Apr 21 07:05:54.440719 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.440702 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.481369 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.481337 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65f995cd7c-tprdx"] Apr 21 07:05:54.511164 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511101 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-serving-cert\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.511164 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-trusted-ca-bundle\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.511354 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-service-ca\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.511354 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-oauth-serving-cert\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 
07:05:54.511354 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511294 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-oauth-config\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.511354 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-config\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.511354 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.511350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h965n\" (UniqueName: \"kubernetes.io/projected/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-kube-api-access-h965n\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.612655 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-service-ca\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.612655 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-oauth-serving-cert\") pod 
\"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.612891 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-oauth-config\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.612891 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-config\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.612891 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h965n\" (UniqueName: \"kubernetes.io/projected/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-kube-api-access-h965n\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.613036 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-serving-cert\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.613036 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.612995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-trusted-ca-bundle\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.613815 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.613785 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-oauth-serving-cert\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.613967 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.613890 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-service-ca\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.616042 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.614169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-config\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.616042 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.614267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-trusted-ca-bundle\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.616042 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.615616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-oauth-config\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.616872 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.616847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-serving-cert\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.622761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.622735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h965n\" (UniqueName: \"kubernetes.io/projected/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-kube-api-access-h965n\") pod \"console-65f995cd7c-tprdx\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") " pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.750600 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.750563 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:05:54.887474 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:54.887448 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65f995cd7c-tprdx"] Apr 21 07:05:54.889543 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:05:54.889473 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ae8e48_9b0c_4c32_9bf8_6f49ea1e956e.slice/crio-aa3402488235691f0716274da09fa372d54bff0b86a2ebd53af3b7f1da979af6 WatchSource:0}: Error finding container aa3402488235691f0716274da09fa372d54bff0b86a2ebd53af3b7f1da979af6: Status 404 returned error can't find the container with id aa3402488235691f0716274da09fa372d54bff0b86a2ebd53af3b7f1da979af6 Apr 21 07:05:55.257625 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:55.257586 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f995cd7c-tprdx" event={"ID":"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e","Type":"ContainerStarted","Data":"a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd"} Apr 21 07:05:55.257625 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:55.257626 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f995cd7c-tprdx" event={"ID":"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e","Type":"ContainerStarted","Data":"aa3402488235691f0716274da09fa372d54bff0b86a2ebd53af3b7f1da979af6"} Apr 21 07:05:55.278408 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:05:55.278369 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65f995cd7c-tprdx" podStartSLOduration=1.278356161 podStartE2EDuration="1.278356161s" podCreationTimestamp="2026-04-21 07:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:05:55.276588897 +0000 UTC m=+180.308640125" 
watchObservedRunningTime="2026-04-21 07:05:55.278356161 +0000 UTC m=+180.310407367" Apr 21 07:06:04.750791 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:04.750749 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:06:04.751198 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:04.750807 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:06:04.755404 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:04.755381 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:06:05.293644 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:05.293615 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65f995cd7c-tprdx" Apr 21 07:06:05.349698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:05.349666 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97755bb4-v8dwm"] Apr 21 07:06:19.329806 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.329740 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8555fcc9b-nts68" podUID="e287b8b8-650d-4318-ad39-c437f0b93ba3" containerName="console" containerID="cri-o://0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c" gracePeriod=15 Apr 21 07:06:19.566183 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.566159 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8555fcc9b-nts68_e287b8b8-650d-4318-ad39-c437f0b93ba3/console/0.log" Apr 21 07:06:19.566298 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.566220 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8555fcc9b-nts68" Apr 21 07:06:19.604662 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604575 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-oauth-config\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.604662 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604613 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-service-ca\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.604662 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604639 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbkb8\" (UniqueName: \"kubernetes.io/projected/e287b8b8-650d-4318-ad39-c437f0b93ba3-kube-api-access-gbkb8\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.604934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604835 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-serving-cert\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.604934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604911 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-oauth-serving-cert\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.605048 
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604944 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-trusted-ca-bundle\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.605048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.604986 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-config\") pod \"e287b8b8-650d-4318-ad39-c437f0b93ba3\" (UID: \"e287b8b8-650d-4318-ad39-c437f0b93ba3\") " Apr 21 07:06:19.605048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.605025 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-service-ca" (OuterVolumeSpecName: "service-ca") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:06:19.605761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.605315 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:06:19.605761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.605379 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-service-ca\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:06:19.605761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.605528 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:06:19.606034 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.605877 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-config" (OuterVolumeSpecName: "console-config") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:06:19.607053 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.607027 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:06:19.607053 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.607039 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e287b8b8-650d-4318-ad39-c437f0b93ba3-kube-api-access-gbkb8" (OuterVolumeSpecName: "kube-api-access-gbkb8") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "kube-api-access-gbkb8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:06:19.607178 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.607051 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e287b8b8-650d-4318-ad39-c437f0b93ba3" (UID: "e287b8b8-650d-4318-ad39-c437f0b93ba3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:06:19.706037 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.706003 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-oauth-config\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:19.706037 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.706035 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbkb8\" (UniqueName: \"kubernetes.io/projected/e287b8b8-650d-4318-ad39-c437f0b93ba3-kube-api-access-gbkb8\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:19.706037 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.706046 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-serving-cert\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:19.706264 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.706056 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-oauth-serving-cert\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:19.706264 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.706065 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-trusted-ca-bundle\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:19.706264 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:19.706074 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e287b8b8-650d-4318-ad39-c437f0b93ba3-console-config\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:20.333343 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.333314 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8555fcc9b-nts68_e287b8b8-650d-4318-ad39-c437f0b93ba3/console/0.log"
Apr 21 07:06:20.333792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.333357 2573 generic.go:358] "Generic (PLEG): container finished" podID="e287b8b8-650d-4318-ad39-c437f0b93ba3" containerID="0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c" exitCode=2
Apr 21 07:06:20.333792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.333426 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8555fcc9b-nts68"
Apr 21 07:06:20.333792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.333449 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8555fcc9b-nts68" event={"ID":"e287b8b8-650d-4318-ad39-c437f0b93ba3","Type":"ContainerDied","Data":"0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c"}
Apr 21 07:06:20.333792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.333489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8555fcc9b-nts68" event={"ID":"e287b8b8-650d-4318-ad39-c437f0b93ba3","Type":"ContainerDied","Data":"058ba7e281dc7e1c2eecc57e229315d44eac9173e9a3462d3522e69ae78eb905"}
Apr 21 07:06:20.333792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.333531 2573 scope.go:117] "RemoveContainer" containerID="0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c"
Apr 21 07:06:20.342058 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.342039 2573 scope.go:117] "RemoveContainer" containerID="0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c"
Apr 21 07:06:20.342292 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:06:20.342276 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c\": container with ID starting with 0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c not found: ID does not exist" containerID="0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c"
Apr 21 07:06:20.342349 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.342299 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c"} err="failed to get container status \"0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c\": rpc error: code = NotFound desc = could not find container \"0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c\": container with ID starting with 0d77800c0a41fae4d51d3c6dc0cfe2f199a597efba1b804ec0f02af73f689d3c not found: ID does not exist"
Apr 21 07:06:20.354648 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.354619 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8555fcc9b-nts68"]
Apr 21 07:06:20.358554 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:20.358534 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8555fcc9b-nts68"]
Apr 21 07:06:21.535641 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:21.535611 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e287b8b8-650d-4318-ad39-c437f0b93ba3" path="/var/lib/kubelet/pods/e287b8b8-650d-4318-ad39-c437f0b93ba3/volumes"
Apr 21 07:06:30.370334 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.370289 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-97755bb4-v8dwm" podUID="b1469328-39d0-4277-857c-203b03aa0a7b" containerName="console" containerID="cri-o://b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b" gracePeriod=15
Apr 21 07:06:30.610902 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.610880 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97755bb4-v8dwm_b1469328-39d0-4277-857c-203b03aa0a7b/console/0.log"
Apr 21 07:06:30.611001 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.610936 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97755bb4-v8dwm"
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.693959 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-service-ca\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.694040 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-oauth-config\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.694110 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-trusted-ca-bundle\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.694158 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-console-config\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.694187 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-serving-cert\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.694242 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpwpr\" (UniqueName: \"kubernetes.io/projected/b1469328-39d0-4277-857c-203b03aa0a7b-kube-api-access-bpwpr\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.694868 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.694278 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-oauth-serving-cert\") pod \"b1469328-39d0-4277-857c-203b03aa0a7b\" (UID: \"b1469328-39d0-4277-857c-203b03aa0a7b\") "
Apr 21 07:06:30.695692 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.695399 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:06:30.696187 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.695895 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-service-ca" (OuterVolumeSpecName: "service-ca") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:06:30.698630 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.698562 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:06:30.698934 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.698910 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-console-config" (OuterVolumeSpecName: "console-config") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:06:30.699009 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.698928 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:06:30.700546 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.700504 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1469328-39d0-4277-857c-203b03aa0a7b-kube-api-access-bpwpr" (OuterVolumeSpecName: "kube-api-access-bpwpr") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "kube-api-access-bpwpr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:06:30.700790 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.700769 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b1469328-39d0-4277-857c-203b03aa0a7b" (UID: "b1469328-39d0-4277-857c-203b03aa0a7b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:06:30.795700 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795680 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-trusted-ca-bundle\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:30.795789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795704 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-console-config\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:30.795789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795714 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-serving-cert\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:30.795789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795723 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpwpr\" (UniqueName: \"kubernetes.io/projected/b1469328-39d0-4277-857c-203b03aa0a7b-kube-api-access-bpwpr\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:30.795789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795733 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-oauth-serving-cert\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:30.795789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795741 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1469328-39d0-4277-857c-203b03aa0a7b-service-ca\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:30.795789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:30.795749 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1469328-39d0-4277-857c-203b03aa0a7b-console-oauth-config\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:06:31.367101 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.367075 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97755bb4-v8dwm_b1469328-39d0-4277-857c-203b03aa0a7b/console/0.log"
Apr 21 07:06:31.367258 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.367113 2573 generic.go:358] "Generic (PLEG): container finished" podID="b1469328-39d0-4277-857c-203b03aa0a7b" containerID="b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b" exitCode=2
Apr 21 07:06:31.367258 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.367200 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97755bb4-v8dwm"
Apr 21 07:06:31.367258 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.367202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97755bb4-v8dwm" event={"ID":"b1469328-39d0-4277-857c-203b03aa0a7b","Type":"ContainerDied","Data":"b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b"}
Apr 21 07:06:31.367258 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.367246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97755bb4-v8dwm" event={"ID":"b1469328-39d0-4277-857c-203b03aa0a7b","Type":"ContainerDied","Data":"4b1bce2bb29163156705c23ac5b1c05cc4e0ce76889fc0221a8ac05960a7f78f"}
Apr 21 07:06:31.367416 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.367263 2573 scope.go:117] "RemoveContainer" containerID="b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b"
Apr 21 07:06:31.375347 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.375147 2573 scope.go:117] "RemoveContainer" containerID="b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b"
Apr 21 07:06:31.375597 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:06:31.375415 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b\": container with ID starting with b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b not found: ID does not exist" containerID="b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b"
Apr 21 07:06:31.375597 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.375440 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b"} err="failed to get container status \"b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b\": rpc error: code = NotFound desc = could not find container \"b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b\": container with ID starting with b28f360524ab1e2dff2bac84988425034dbf20c38e6ffb3da0c3f196d49f768b not found: ID does not exist"
Apr 21 07:06:31.388452 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.388433 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97755bb4-v8dwm"]
Apr 21 07:06:31.390443 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.390424 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-97755bb4-v8dwm"]
Apr 21 07:06:31.535275 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:06:31.535249 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1469328-39d0-4277-857c-203b03aa0a7b" path="/var/lib/kubelet/pods/b1469328-39d0-4277-857c-203b03aa0a7b/volumes"
Apr 21 07:07:14.984247 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984171 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"]
Apr 21 07:07:14.984663 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984581 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e287b8b8-650d-4318-ad39-c437f0b93ba3" containerName="console"
Apr 21 07:07:14.984663 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984598 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e287b8b8-650d-4318-ad39-c437f0b93ba3" containerName="console"
Apr 21 07:07:14.984663 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984628 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1469328-39d0-4277-857c-203b03aa0a7b" containerName="console"
Apr 21 07:07:14.984663 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984637 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1469328-39d0-4277-857c-203b03aa0a7b" containerName="console"
Apr 21 07:07:14.984789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984699 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1469328-39d0-4277-857c-203b03aa0a7b" containerName="console"
Apr 21 07:07:14.984789 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.984710 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e287b8b8-650d-4318-ad39-c437f0b93ba3" containerName="console"
Apr 21 07:07:14.987803 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.987786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:14.990299 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.990280 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 07:07:14.990402 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.990381 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 07:07:14.991204 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.991190 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t6wcq\""
Apr 21 07:07:14.996573 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:14.996552 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"]
Apr 21 07:07:15.092356 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.092332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.092460 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.092363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb7t4\" (UniqueName: \"kubernetes.io/projected/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-kube-api-access-tb7t4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.092460 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.092390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.193056 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.193033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.193137 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.193063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tb7t4\" (UniqueName: \"kubernetes.io/projected/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-kube-api-access-tb7t4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.193137 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.193085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.193400 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.193383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.193441 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.193403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.201855 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.201829 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb7t4\" (UniqueName: \"kubernetes.io/projected/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-kube-api-access-tb7t4\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.297606 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.297561 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:15.412015 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.411954 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"]
Apr 21 07:07:15.414543 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:07:15.414503 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a90c70d_f5db_4aa9_84b4_4febb7ad6d29.slice/crio-4b95b0c54f1e8fee240026a023f7a35fc40cd913de3bec44e06245983726e4ec WatchSource:0}: Error finding container 4b95b0c54f1e8fee240026a023f7a35fc40cd913de3bec44e06245983726e4ec: Status 404 returned error can't find the container with id 4b95b0c54f1e8fee240026a023f7a35fc40cd913de3bec44e06245983726e4ec
Apr 21 07:07:15.489563 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:15.489530 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8" event={"ID":"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29","Type":"ContainerStarted","Data":"4b95b0c54f1e8fee240026a023f7a35fc40cd913de3bec44e06245983726e4ec"}
Apr 21 07:07:22.510538 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:22.510482 2573 generic.go:358] "Generic (PLEG): container finished" podID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerID="23f8e0e0dfb27135048682570ce3716b1e5a9d24618161e4fb91555f0d9ae883" exitCode=0
Apr 21 07:07:22.510904 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:22.510571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8" event={"ID":"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29","Type":"ContainerDied","Data":"23f8e0e0dfb27135048682570ce3716b1e5a9d24618161e4fb91555f0d9ae883"}
Apr 21 07:07:25.520583 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:25.520541 2573 generic.go:358] "Generic (PLEG): container finished" podID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerID="7e63275f3af57c7f3a80bba14c350c2f51b6fdac269c39557b6de9878ee42e89" exitCode=0
Apr 21 07:07:25.521028 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:25.520593 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8" event={"ID":"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29","Type":"ContainerDied","Data":"7e63275f3af57c7f3a80bba14c350c2f51b6fdac269c39557b6de9878ee42e89"}
Apr 21 07:07:32.541920 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:32.541837 2573 generic.go:358] "Generic (PLEG): container finished" podID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerID="10d17734cef8c55a8e742e63c67e3f409cc302e90ed3a78276aab56bac4ffafc" exitCode=0
Apr 21 07:07:32.541920 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:32.541899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8" event={"ID":"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29","Type":"ContainerDied","Data":"10d17734cef8c55a8e742e63c67e3f409cc302e90ed3a78276aab56bac4ffafc"}
Apr 21 07:07:33.665158 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.665136 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:33.741061 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.741034 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-bundle\") pod \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") "
Apr 21 07:07:33.741158 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.741072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-util\") pod \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") "
Apr 21 07:07:33.741158 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.741101 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb7t4\" (UniqueName: \"kubernetes.io/projected/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-kube-api-access-tb7t4\") pod \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\" (UID: \"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29\") "
Apr 21 07:07:33.741676 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.741650 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-bundle" (OuterVolumeSpecName: "bundle") pod "2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" (UID: "2a90c70d-f5db-4aa9-84b4-4febb7ad6d29"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:07:33.743138 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.743115 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-kube-api-access-tb7t4" (OuterVolumeSpecName: "kube-api-access-tb7t4") pod "2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" (UID: "2a90c70d-f5db-4aa9-84b4-4febb7ad6d29"). InnerVolumeSpecName "kube-api-access-tb7t4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:07:33.746573 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.746546 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-util" (OuterVolumeSpecName: "util") pod "2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" (UID: "2a90c70d-f5db-4aa9-84b4-4febb7ad6d29"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:07:33.842214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.842160 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-util\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:07:33.842214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.842181 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tb7t4\" (UniqueName: \"kubernetes.io/projected/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-kube-api-access-tb7t4\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:07:33.842214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:33.842191 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a90c70d-f5db-4aa9-84b4-4febb7ad6d29-bundle\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:07:34.549067 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:34.549028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8" event={"ID":"2a90c70d-f5db-4aa9-84b4-4febb7ad6d29","Type":"ContainerDied","Data":"4b95b0c54f1e8fee240026a023f7a35fc40cd913de3bec44e06245983726e4ec"}
Apr 21 07:07:34.549067 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:34.549058 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9llz8"
Apr 21 07:07:34.549262 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:34.549063 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b95b0c54f1e8fee240026a023f7a35fc40cd913de3bec44e06245983726e4ec"
Apr 21 07:07:38.092374 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092339 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"]
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092664 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="util"
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092678 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="util"
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092686 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="pull"
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092691 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="pull"
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092702 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="extract"
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092708 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="extract"
Apr 21 07:07:38.092793 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.092753 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a90c70d-f5db-4aa9-84b4-4febb7ad6d29" containerName="extract"
Apr 21 07:07:38.099419 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.099401 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.101938 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.101915 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 21 07:07:38.102048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.101935 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:07:38.102048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.101989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6nwtt\""
Apr 21 07:07:38.108387 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.108364 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"]
Apr 21 07:07:38.170063 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.170037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7bz\" (UniqueName: \"kubernetes.io/projected/2c108650-e54b-4b36-a4f3-ed9cc10fba58-kube-api-access-fb7bz\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-bv8cz\" (UID: \"2c108650-e54b-4b36-a4f3-ed9cc10fba58\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.170163 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.170067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c108650-e54b-4b36-a4f3-ed9cc10fba58-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-bv8cz\" (UID: \"2c108650-e54b-4b36-a4f3-ed9cc10fba58\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.271167 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.271141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7bz\" (UniqueName: \"kubernetes.io/projected/2c108650-e54b-4b36-a4f3-ed9cc10fba58-kube-api-access-fb7bz\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-bv8cz\" (UID: \"2c108650-e54b-4b36-a4f3-ed9cc10fba58\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.271262 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.271172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c108650-e54b-4b36-a4f3-ed9cc10fba58-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-bv8cz\" (UID: \"2c108650-e54b-4b36-a4f3-ed9cc10fba58\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.271562 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.271547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c108650-e54b-4b36-a4f3-ed9cc10fba58-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-bv8cz\" (UID: \"2c108650-e54b-4b36-a4f3-ed9cc10fba58\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.282410 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.282392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7bz\" (UniqueName: \"kubernetes.io/projected/2c108650-e54b-4b36-a4f3-ed9cc10fba58-kube-api-access-fb7bz\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-bv8cz\" (UID: \"2c108650-e54b-4b36-a4f3-ed9cc10fba58\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.409812 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.409759 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"
Apr 21 07:07:38.529904 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.529850 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz"]
Apr 21 07:07:38.534105 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:07:38.534080 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c108650_e54b_4b36_a4f3_ed9cc10fba58.slice/crio-1cb3dd91c9202d6355d187bc29e4d678eab00aa45969133b621b8b2adacd5078 WatchSource:0}: Error finding container 1cb3dd91c9202d6355d187bc29e4d678eab00aa45969133b621b8b2adacd5078: Status 404 returned error can't find the container with id 1cb3dd91c9202d6355d187bc29e4d678eab00aa45969133b621b8b2adacd5078
Apr 21 07:07:38.560923 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:38.560897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz" event={"ID":"2c108650-e54b-4b36-a4f3-ed9cc10fba58","Type":"ContainerStarted","Data":"1cb3dd91c9202d6355d187bc29e4d678eab00aa45969133b621b8b2adacd5078"}
Apr 21 07:07:44.579946
ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:44.579869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz" event={"ID":"2c108650-e54b-4b36-a4f3-ed9cc10fba58","Type":"ContainerStarted","Data":"4504b1e42bf2767c717cfd884c8fcf878ac8f5daf6311f6b2c579f060ee11eb1"} Apr 21 07:07:44.602946 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:44.602891 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-bv8cz" podStartSLOduration=0.888720632 podStartE2EDuration="6.602876272s" podCreationTimestamp="2026-04-21 07:07:38 +0000 UTC" firstStartedPulling="2026-04-21 07:07:38.537265698 +0000 UTC m=+283.569316904" lastFinishedPulling="2026-04-21 07:07:44.25142133 +0000 UTC m=+289.283472544" observedRunningTime="2026-04-21 07:07:44.600467514 +0000 UTC m=+289.632518742" watchObservedRunningTime="2026-04-21 07:07:44.602876272 +0000 UTC m=+289.634927500" Apr 21 07:07:50.243917 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.243885 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9dgkj"] Apr 21 07:07:50.247417 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.247400 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.251013 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.250991 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 07:07:50.251972 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.251952 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-256fw\"" Apr 21 07:07:50.252083 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.251952 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 07:07:50.266030 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.266006 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9dgkj"] Apr 21 07:07:50.352743 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.352718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8648e19-8912-4b46-b217-d4519ffcd733-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9dgkj\" (UID: \"c8648e19-8912-4b46-b217-d4519ffcd733\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.352840 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.352772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mlh7\" (UniqueName: \"kubernetes.io/projected/c8648e19-8912-4b46-b217-d4519ffcd733-kube-api-access-6mlh7\") pod \"cert-manager-cainjector-68b757865b-9dgkj\" (UID: \"c8648e19-8912-4b46-b217-d4519ffcd733\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.453209 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.453181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6mlh7\" (UniqueName: \"kubernetes.io/projected/c8648e19-8912-4b46-b217-d4519ffcd733-kube-api-access-6mlh7\") pod \"cert-manager-cainjector-68b757865b-9dgkj\" (UID: \"c8648e19-8912-4b46-b217-d4519ffcd733\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.453300 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.453263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8648e19-8912-4b46-b217-d4519ffcd733-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9dgkj\" (UID: \"c8648e19-8912-4b46-b217-d4519ffcd733\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.461322 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.461296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8648e19-8912-4b46-b217-d4519ffcd733-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9dgkj\" (UID: \"c8648e19-8912-4b46-b217-d4519ffcd733\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.461503 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.461484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mlh7\" (UniqueName: \"kubernetes.io/projected/c8648e19-8912-4b46-b217-d4519ffcd733-kube-api-access-6mlh7\") pod \"cert-manager-cainjector-68b757865b-9dgkj\" (UID: \"c8648e19-8912-4b46-b217-d4519ffcd733\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.568820 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.568769 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" Apr 21 07:07:50.689241 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:50.689214 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9dgkj"] Apr 21 07:07:50.692044 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:07:50.692015 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8648e19_8912_4b46_b217_d4519ffcd733.slice/crio-cfd7a73d065f8747086553c989de41c1ba2dbd44289e93c4ce34e597eb9086aa WatchSource:0}: Error finding container cfd7a73d065f8747086553c989de41c1ba2dbd44289e93c4ce34e597eb9086aa: Status 404 returned error can't find the container with id cfd7a73d065f8747086553c989de41c1ba2dbd44289e93c4ce34e597eb9086aa Apr 21 07:07:51.600801 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:51.600764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" event={"ID":"c8648e19-8912-4b46-b217-d4519ffcd733","Type":"ContainerStarted","Data":"cfd7a73d065f8747086553c989de41c1ba2dbd44289e93c4ce34e597eb9086aa"} Apr 21 07:07:52.667376 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.667343 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-hxxwm"] Apr 21 07:07:52.671225 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.671200 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.674990 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.674967 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-tf2dr\"" Apr 21 07:07:52.685269 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.685231 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-hxxwm"] Apr 21 07:07:52.770374 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.770341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/651b985f-6d25-4288-adea-a9bb833bdb6d-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-hxxwm\" (UID: \"651b985f-6d25-4288-adea-a9bb833bdb6d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.770553 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.770379 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn44c\" (UniqueName: \"kubernetes.io/projected/651b985f-6d25-4288-adea-a9bb833bdb6d-kube-api-access-sn44c\") pod \"cert-manager-webhook-587ccfb98-hxxwm\" (UID: \"651b985f-6d25-4288-adea-a9bb833bdb6d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.870940 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.870904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/651b985f-6d25-4288-adea-a9bb833bdb6d-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-hxxwm\" (UID: \"651b985f-6d25-4288-adea-a9bb833bdb6d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.871095 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.870953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sn44c\" (UniqueName: \"kubernetes.io/projected/651b985f-6d25-4288-adea-a9bb833bdb6d-kube-api-access-sn44c\") pod \"cert-manager-webhook-587ccfb98-hxxwm\" (UID: \"651b985f-6d25-4288-adea-a9bb833bdb6d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.881042 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.881018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/651b985f-6d25-4288-adea-a9bb833bdb6d-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-hxxwm\" (UID: \"651b985f-6d25-4288-adea-a9bb833bdb6d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.881257 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.881235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn44c\" (UniqueName: \"kubernetes.io/projected/651b985f-6d25-4288-adea-a9bb833bdb6d-kube-api-access-sn44c\") pod \"cert-manager-webhook-587ccfb98-hxxwm\" (UID: \"651b985f-6d25-4288-adea-a9bb833bdb6d\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:52.983175 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:52.983144 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:53.491186 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.491160 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-hxxwm"] Apr 21 07:07:53.493044 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:07:53.493021 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod651b985f_6d25_4288_adea_a9bb833bdb6d.slice/crio-7d38813c9e6efe25c61562dc0981e0de1b210156221c91c09f2edbbaf901b207 WatchSource:0}: Error finding container 7d38813c9e6efe25c61562dc0981e0de1b210156221c91c09f2edbbaf901b207: Status 404 returned error can't find the container with id 7d38813c9e6efe25c61562dc0981e0de1b210156221c91c09f2edbbaf901b207 Apr 21 07:07:53.608798 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.608759 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" event={"ID":"c8648e19-8912-4b46-b217-d4519ffcd733","Type":"ContainerStarted","Data":"ec425af3164183931f16a9ca952cbd136480ba326fa914be3777ba49313a53d4"} Apr 21 07:07:53.610090 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.610054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" event={"ID":"651b985f-6d25-4288-adea-a9bb833bdb6d","Type":"ContainerStarted","Data":"68a2451fd9c6af318c7f43a154b6e75a78464dd7f7ace14f914843e4065a5d97"} Apr 21 07:07:53.610214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.610102 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" event={"ID":"651b985f-6d25-4288-adea-a9bb833bdb6d","Type":"ContainerStarted","Data":"7d38813c9e6efe25c61562dc0981e0de1b210156221c91c09f2edbbaf901b207"} Apr 21 07:07:53.610214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.610133 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:07:53.628792 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.628704 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-9dgkj" podStartSLOduration=0.894498363 podStartE2EDuration="3.628691311s" podCreationTimestamp="2026-04-21 07:07:50 +0000 UTC" firstStartedPulling="2026-04-21 07:07:50.694263343 +0000 UTC m=+295.726314552" lastFinishedPulling="2026-04-21 07:07:53.428456283 +0000 UTC m=+298.460507500" observedRunningTime="2026-04-21 07:07:53.627388159 +0000 UTC m=+298.659439387" watchObservedRunningTime="2026-04-21 07:07:53.628691311 +0000 UTC m=+298.660742538" Apr 21 07:07:53.655487 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:53.655434 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" podStartSLOduration=1.655417731 podStartE2EDuration="1.655417731s" podCreationTimestamp="2026-04-21 07:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:07:53.65030503 +0000 UTC m=+298.682356259" watchObservedRunningTime="2026-04-21 07:07:53.655417731 +0000 UTC m=+298.687468960" Apr 21 07:07:55.398958 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:55.398927 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log" Apr 21 07:07:55.399583 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:55.399055 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log" Apr 21 07:07:55.409346 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:55.409322 2573 kubelet.go:1628] "Image garbage 
collection succeeded" Apr 21 07:07:59.615931 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:07:59.615903 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-hxxwm" Apr 21 07:08:05.829761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:05.829729 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-cwrlj"] Apr 21 07:08:05.832721 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:05.832705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:05.835177 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:05.835158 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-cnkr5\"" Apr 21 07:08:05.840085 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:05.840067 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-cwrlj"] Apr 21 07:08:05.958271 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:05.958248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f76826c5-35c5-4887-8e04-308f650df7a4-bound-sa-token\") pod \"cert-manager-79c8d999ff-cwrlj\" (UID: \"f76826c5-35c5-4887-8e04-308f650df7a4\") " pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:05.958362 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:05.958281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75bl\" (UniqueName: \"kubernetes.io/projected/f76826c5-35c5-4887-8e04-308f650df7a4-kube-api-access-l75bl\") pod \"cert-manager-79c8d999ff-cwrlj\" (UID: \"f76826c5-35c5-4887-8e04-308f650df7a4\") " pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:06.058839 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.058814 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f76826c5-35c5-4887-8e04-308f650df7a4-bound-sa-token\") pod \"cert-manager-79c8d999ff-cwrlj\" (UID: \"f76826c5-35c5-4887-8e04-308f650df7a4\") " pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:06.058930 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.058848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l75bl\" (UniqueName: \"kubernetes.io/projected/f76826c5-35c5-4887-8e04-308f650df7a4-kube-api-access-l75bl\") pod \"cert-manager-79c8d999ff-cwrlj\" (UID: \"f76826c5-35c5-4887-8e04-308f650df7a4\") " pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:06.068499 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.068463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f76826c5-35c5-4887-8e04-308f650df7a4-bound-sa-token\") pod \"cert-manager-79c8d999ff-cwrlj\" (UID: \"f76826c5-35c5-4887-8e04-308f650df7a4\") " pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:06.068597 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.068549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75bl\" (UniqueName: \"kubernetes.io/projected/f76826c5-35c5-4887-8e04-308f650df7a4-kube-api-access-l75bl\") pod \"cert-manager-79c8d999ff-cwrlj\" (UID: \"f76826c5-35c5-4887-8e04-308f650df7a4\") " pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:06.142449 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.142382 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-cwrlj" Apr 21 07:08:06.264788 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.264764 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-cwrlj"] Apr 21 07:08:06.267091 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:08:06.267065 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf76826c5_35c5_4887_8e04_308f650df7a4.slice/crio-f8ac3a2dd9129cd4bae4af139a8e90d167c7ceac8d8b24ff1660c90bdbbb8667 WatchSource:0}: Error finding container f8ac3a2dd9129cd4bae4af139a8e90d167c7ceac8d8b24ff1660c90bdbbb8667: Status 404 returned error can't find the container with id f8ac3a2dd9129cd4bae4af139a8e90d167c7ceac8d8b24ff1660c90bdbbb8667 Apr 21 07:08:06.269015 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.268998 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 07:08:06.650740 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.650706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-cwrlj" event={"ID":"f76826c5-35c5-4887-8e04-308f650df7a4","Type":"ContainerStarted","Data":"cdcdf1d49228568cb223f4ca5d92d1e81ca1637b0f80918f7373fa76d0736d79"} Apr 21 07:08:06.650740 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.650744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-cwrlj" event={"ID":"f76826c5-35c5-4887-8e04-308f650df7a4","Type":"ContainerStarted","Data":"f8ac3a2dd9129cd4bae4af139a8e90d167c7ceac8d8b24ff1660c90bdbbb8667"} Apr 21 07:08:06.667926 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:06.667882 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-cwrlj" podStartSLOduration=1.66786989 podStartE2EDuration="1.66786989s" podCreationTimestamp="2026-04-21 07:08:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:08:06.667351437 +0000 UTC m=+311.699402666" watchObservedRunningTime="2026-04-21 07:08:06.66786989 +0000 UTC m=+311.699921118" Apr 21 07:08:07.773364 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.773329 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"] Apr 21 07:08:07.776539 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.776522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.779164 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.779143 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 07:08:07.779276 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.779254 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t6wcq\"" Apr 21 07:08:07.780183 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.780168 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 07:08:07.785720 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.785699 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"] Apr 21 07:08:07.872820 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.872797 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: 
\"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.872918 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.872865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.872918 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.872897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8x69\" (UniqueName: \"kubernetes.io/projected/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-kube-api-access-f8x69\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.973741 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.973719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.973845 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.973775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.973845 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.973799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8x69\" (UniqueName: \"kubernetes.io/projected/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-kube-api-access-f8x69\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.974087 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.974059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.974137 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.974120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" Apr 21 07:08:07.982496 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:07.982468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8x69\" (UniqueName: \"kubernetes.io/projected/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-kube-api-access-f8x69\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"
Apr 21 07:08:08.086624 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:08.086572 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"
Apr 21 07:08:08.208774 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:08.208745 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"]
Apr 21 07:08:08.210157 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:08:08.210122 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972ec9c8_6ce5_4ab7_ba41_f4993430ee80.slice/crio-0435261106e1f37919a7237bd6354c6a7e9a59f4bb087b0e5221d98265d9981f WatchSource:0}: Error finding container 0435261106e1f37919a7237bd6354c6a7e9a59f4bb087b0e5221d98265d9981f: Status 404 returned error can't find the container with id 0435261106e1f37919a7237bd6354c6a7e9a59f4bb087b0e5221d98265d9981f
Apr 21 07:08:08.658595 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:08.658566 2573 generic.go:358] "Generic (PLEG): container finished" podID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerID="e2e233e75aa89236b6d9c6c5201d6bf41ead2d3c02c76333617c0bb3ecea42e8" exitCode=0
Apr 21 07:08:08.658747 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:08.658652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" event={"ID":"972ec9c8-6ce5-4ab7-ba41-f4993430ee80","Type":"ContainerDied","Data":"e2e233e75aa89236b6d9c6c5201d6bf41ead2d3c02c76333617c0bb3ecea42e8"}
Apr 21 07:08:08.658747 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:08.658683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" event={"ID":"972ec9c8-6ce5-4ab7-ba41-f4993430ee80","Type":"ContainerStarted","Data":"0435261106e1f37919a7237bd6354c6a7e9a59f4bb087b0e5221d98265d9981f"}
Apr 21 07:08:11.675070 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:11.675034 2573 generic.go:358] "Generic (PLEG): container finished" podID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerID="c86ca02f5b36b81350df6c1c91680d71ebb0e1868e6f75255e4563d6ab57dbdc" exitCode=0
Apr 21 07:08:11.675467 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:11.675076 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" event={"ID":"972ec9c8-6ce5-4ab7-ba41-f4993430ee80","Type":"ContainerDied","Data":"c86ca02f5b36b81350df6c1c91680d71ebb0e1868e6f75255e4563d6ab57dbdc"}
Apr 21 07:08:12.681257 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:12.681225 2573 generic.go:358] "Generic (PLEG): container finished" podID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerID="0e2821f5c17c2e796adb45450949f48c64c6d65c5c4890c6b4c5b12b5422260a" exitCode=0
Apr 21 07:08:12.681688 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:12.681297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" event={"ID":"972ec9c8-6ce5-4ab7-ba41-f4993430ee80","Type":"ContainerDied","Data":"0e2821f5c17c2e796adb45450949f48c64c6d65c5c4890c6b4c5b12b5422260a"}
Apr 21 07:08:13.811310 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.811287 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"
Apr 21 07:08:13.919879 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.919852 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-util\") pod \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") "
Apr 21 07:08:13.920026 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.919920 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8x69\" (UniqueName: \"kubernetes.io/projected/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-kube-api-access-f8x69\") pod \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") "
Apr 21 07:08:13.920026 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.919968 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-bundle\") pod \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\" (UID: \"972ec9c8-6ce5-4ab7-ba41-f4993430ee80\") "
Apr 21 07:08:13.920399 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.920371 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-bundle" (OuterVolumeSpecName: "bundle") pod "972ec9c8-6ce5-4ab7-ba41-f4993430ee80" (UID: "972ec9c8-6ce5-4ab7-ba41-f4993430ee80"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:08:13.922192 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.922169 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-kube-api-access-f8x69" (OuterVolumeSpecName: "kube-api-access-f8x69") pod "972ec9c8-6ce5-4ab7-ba41-f4993430ee80" (UID: "972ec9c8-6ce5-4ab7-ba41-f4993430ee80"). InnerVolumeSpecName "kube-api-access-f8x69". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:08:13.924181 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:13.924150 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-util" (OuterVolumeSpecName: "util") pod "972ec9c8-6ce5-4ab7-ba41-f4993430ee80" (UID: "972ec9c8-6ce5-4ab7-ba41-f4993430ee80"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:08:14.020652 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:14.020628 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-util\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:08:14.020652 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:14.020651 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8x69\" (UniqueName: \"kubernetes.io/projected/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-kube-api-access-f8x69\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:08:14.020774 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:14.020661 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/972ec9c8-6ce5-4ab7-ba41-f4993430ee80-bundle\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:08:14.688899 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:14.688874 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw"
Apr 21 07:08:14.689048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:14.688894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ewxdvw" event={"ID":"972ec9c8-6ce5-4ab7-ba41-f4993430ee80","Type":"ContainerDied","Data":"0435261106e1f37919a7237bd6354c6a7e9a59f4bb087b0e5221d98265d9981f"}
Apr 21 07:08:14.689048 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:08:14.688923 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0435261106e1f37919a7237bd6354c6a7e9a59f4bb087b0e5221d98265d9981f"
Apr 21 07:10:46.292431 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292397 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-97784d49c-2s8qx"]
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292721 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="util"
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292733 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="util"
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292743 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="extract"
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292748 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="extract"
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292755 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="pull"
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292760 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="pull"
Apr 21 07:10:46.293110 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.292824 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="972ec9c8-6ce5-4ab7-ba41-f4993430ee80" containerName="extract"
Apr 21 07:10:46.295853 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.295829 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.309156 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.309131 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97784d49c-2s8qx"]
Apr 21 07:10:46.395662 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-service-ca\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.395825 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-console-config\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.395825 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395720 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f28879-c606-442a-aa11-308f66af2653-console-serving-cert\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.395825 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-oauth-serving-cert\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.395825 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-trusted-ca-bundle\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.395825 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2f28879-c606-442a-aa11-308f66af2653-console-oauth-config\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.396029 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.395832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdpd4\" (UniqueName: \"kubernetes.io/projected/f2f28879-c606-442a-aa11-308f66af2653-kube-api-access-rdpd4\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.496830 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.496791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2f28879-c606-442a-aa11-308f66af2653-console-oauth-config\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.496830 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.496833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdpd4\" (UniqueName: \"kubernetes.io/projected/f2f28879-c606-442a-aa11-308f66af2653-kube-api-access-rdpd4\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497086 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.496859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-service-ca\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497086 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.496909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-console-config\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497086 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.496943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f28879-c606-442a-aa11-308f66af2653-console-serving-cert\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497086 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.496967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-oauth-serving-cert\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497283 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.497107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-trusted-ca-bundle\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497735 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.497711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-oauth-serving-cert\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.497763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-service-ca\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.497849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-console-config\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.497977 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.497951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2f28879-c606-442a-aa11-308f66af2653-trusted-ca-bundle\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.499486 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.499456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2f28879-c606-442a-aa11-308f66af2653-console-oauth-config\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.499603 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.499546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f28879-c606-442a-aa11-308f66af2653-console-serving-cert\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.507232 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.507208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdpd4\" (UniqueName: \"kubernetes.io/projected/f2f28879-c606-442a-aa11-308f66af2653-kube-api-access-rdpd4\") pod \"console-97784d49c-2s8qx\" (UID: \"f2f28879-c606-442a-aa11-308f66af2653\") " pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.605197 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.605097 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:46.731847 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:46.731821 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97784d49c-2s8qx"]
Apr 21 07:10:46.733809 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:10:46.733781 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f28879_c606_442a_aa11_308f66af2653.slice/crio-ac85179613cc017f64f21df13217a53d2b8155256aa671bd17b711b52004358e WatchSource:0}: Error finding container ac85179613cc017f64f21df13217a53d2b8155256aa671bd17b711b52004358e: Status 404 returned error can't find the container with id ac85179613cc017f64f21df13217a53d2b8155256aa671bd17b711b52004358e
Apr 21 07:10:47.200698 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:47.200662 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97784d49c-2s8qx" event={"ID":"f2f28879-c606-442a-aa11-308f66af2653","Type":"ContainerStarted","Data":"7342e3801818b25051a4ed7e474c5f148e65ec3a7bcd50dca4b035dcfd71c416"}
Apr 21 07:10:47.200874 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:47.200705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97784d49c-2s8qx" event={"ID":"f2f28879-c606-442a-aa11-308f66af2653","Type":"ContainerStarted","Data":"ac85179613cc017f64f21df13217a53d2b8155256aa671bd17b711b52004358e"}
Apr 21 07:10:47.233113 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:47.233065 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-97784d49c-2s8qx" podStartSLOduration=1.233049871 podStartE2EDuration="1.233049871s" podCreationTimestamp="2026-04-21 07:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:10:47.232367237 +0000 UTC m=+472.264418478" watchObservedRunningTime="2026-04-21 07:10:47.233049871 +0000 UTC m=+472.265101098"
Apr 21 07:10:56.606035 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:56.605930 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:56.606035 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:56.606004 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:56.610501 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:56.610479 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:57.239349 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:57.239312 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-97784d49c-2s8qx"
Apr 21 07:10:57.303591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:10:57.303562 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65f995cd7c-tprdx"]
Apr 21 07:11:22.323646 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.323589 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65f995cd7c-tprdx" podUID="f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" containerName="console" containerID="cri-o://a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd" gracePeriod=15
Apr 21 07:11:22.563873 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.563848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65f995cd7c-tprdx_f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e/console/0.log"
Apr 21 07:11:22.564006 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.563920 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65f995cd7c-tprdx"
Apr 21 07:11:22.605456 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605379 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h965n\" (UniqueName: \"kubernetes.io/projected/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-kube-api-access-h965n\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.605456 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605419 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-trusted-ca-bundle\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.605456 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605451 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-oauth-serving-cert\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.605718 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605487 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-oauth-config\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.605718 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605526 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-serving-cert\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.605718 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605560 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-config\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.605718 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605607 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-service-ca\") pod \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\" (UID: \"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e\") "
Apr 21 07:11:22.606029 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.605999 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:11:22.606179 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.606152 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:11:22.606253 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.606175 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-config" (OuterVolumeSpecName: "console-config") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:11:22.606253 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.606101 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-service-ca" (OuterVolumeSpecName: "service-ca") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:11:22.607776 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.607734 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:11:22.608097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.608072 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-kube-api-access-h965n" (OuterVolumeSpecName: "kube-api-access-h965n") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "kube-api-access-h965n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:11:22.608097 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.608074 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" (UID: "f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:11:22.706932 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706899 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h965n\" (UniqueName: \"kubernetes.io/projected/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-kube-api-access-h965n\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:22.706932 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706930 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-trusted-ca-bundle\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:22.706932 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706939 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-oauth-serving-cert\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:22.707153 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706947 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-oauth-config\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:22.707153 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706957 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-serving-cert\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:22.707153 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706966 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-console-config\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:22.707153 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:22.706975 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e-service-ca\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\""
Apr 21 07:11:23.325488 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.325458 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65f995cd7c-tprdx_f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e/console/0.log"
Apr 21 07:11:23.325979 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.325497 2573 generic.go:358] "Generic (PLEG): container finished" podID="f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" containerID="a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd" exitCode=2
Apr 21 07:11:23.325979 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.325593 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65f995cd7c-tprdx"
Apr 21 07:11:23.325979 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.325600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f995cd7c-tprdx" event={"ID":"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e","Type":"ContainerDied","Data":"a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd"}
Apr 21 07:11:23.325979 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.325636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f995cd7c-tprdx" event={"ID":"f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e","Type":"ContainerDied","Data":"aa3402488235691f0716274da09fa372d54bff0b86a2ebd53af3b7f1da979af6"}
Apr 21 07:11:23.325979 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.325651 2573 scope.go:117] "RemoveContainer" containerID="a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd"
Apr 21 07:11:23.334216 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.334199 2573 scope.go:117] "RemoveContainer" containerID="a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd"
Apr 21 07:11:23.334468 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:11:23.334447 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd\": container with ID starting with a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd not found: ID does not exist" containerID="a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd"
Apr 21 07:11:23.334545 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.334476 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd"} err="failed to get container status \"a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd\": rpc error: code = NotFound desc = could not find container \"a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd\": container with ID starting with a68213e0ba82dbf252861ca4a64488e15df178a8a034f1c58c1b8f7dd8ca58dd not found: ID does not exist"
Apr 21 07:11:23.348544 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.348521 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65f995cd7c-tprdx"]
Apr 21 07:11:23.355305 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.355287 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65f995cd7c-tprdx"]
Apr 21 07:11:23.536693 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:11:23.536657 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" path="/var/lib/kubelet/pods/f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e/volumes"
Apr 21 07:12:55.421301 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:12:55.421272 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log"
Apr 21 07:12:55.423427 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:12:55.423403 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log"
Apr 21 07:14:05.608695 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.608661 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"]
Apr 21 07:14:05.609115 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.608955 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" containerName="console"
Apr 21 07:14:05.609115 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.608965 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" containerName="console"
Apr 21 07:14:05.609115 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.609025 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0ae8e48-9b0c-4c32-9bf8-6f49ea1e956e" containerName="console"
Apr 21 07:14:05.611685 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.611669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:14:05.614122 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.614093 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"openshift-service-ca.crt\""
Apr 21 07:14:05.614259 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.614192 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"default-dockercfg-lkwr7\""
Apr 21 07:14:05.614259 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.614194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"kube-root-ca.crt\""
Apr 21 07:14:05.621445 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.621421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"]
Apr 21 07:14:05.744888 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.744851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjtv\" (UniqueName: \"kubernetes.io/projected/3ae14885-0f42-4a44-bb04-b7a08b3308d8-kube-api-access-ktjtv\") pod \"progression-custom-config-node-0-0-7svkc\" (UID: \"3ae14885-0f42-4a44-bb04-b7a08b3308d8\") " pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:14:05.845455 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.845417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjtv\" (UniqueName: \"kubernetes.io/projected/3ae14885-0f42-4a44-bb04-b7a08b3308d8-kube-api-access-ktjtv\") pod \"progression-custom-config-node-0-0-7svkc\" (UID: \"3ae14885-0f42-4a44-bb04-b7a08b3308d8\") " pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:14:05.853755 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.853719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjtv\" (UniqueName: \"kubernetes.io/projected/3ae14885-0f42-4a44-bb04-b7a08b3308d8-kube-api-access-ktjtv\") pod \"progression-custom-config-node-0-0-7svkc\" (UID: \"3ae14885-0f42-4a44-bb04-b7a08b3308d8\") " pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:14:05.921155 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:05.921069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:14:06.047543 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:06.047452 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"]
Apr 21 07:14:06.049878 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:14:06.049834 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae14885_0f42_4a44_bb04_b7a08b3308d8.slice/crio-0fb5dbb7557c2ba99073e6abdb5bdc1adc007508492de51d5a862bd260606c1f WatchSource:0}: Error finding container 0fb5dbb7557c2ba99073e6abdb5bdc1adc007508492de51d5a862bd260606c1f: Status 404 returned error can't find the container with id 0fb5dbb7557c2ba99073e6abdb5bdc1adc007508492de51d5a862bd260606c1f
Apr 21 07:14:06.052068 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:06.052052 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 07:14:06.887866 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:14:06.887821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" event={"ID":"3ae14885-0f42-4a44-bb04-b7a08b3308d8","Type":"ContainerStarted","Data":"0fb5dbb7557c2ba99073e6abdb5bdc1adc007508492de51d5a862bd260606c1f"}
Apr 21 07:15:58.296370 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:15:58.296332 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" event={"ID":"3ae14885-0f42-4a44-bb04-b7a08b3308d8","Type":"ContainerStarted","Data":"4c3b42ac14891c96ae7a99fa9299c3636ae51672ce0e007e4009d826bcf25a21"}
Apr 21 07:15:58.296939 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:15:58.296444 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:15:58.314780 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:15:58.314716 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" podStartSLOduration=2.176679046 podStartE2EDuration="1m53.314697888s" podCreationTimestamp="2026-04-21 07:14:05 +0000 UTC" firstStartedPulling="2026-04-21 07:14:06.052177841 +0000 UTC m=+671.084229047" lastFinishedPulling="2026-04-21 07:15:57.190196684 +0000 UTC m=+782.222247889" observedRunningTime="2026-04-21 07:15:58.312731704 +0000 UTC m=+783.344782932" watchObservedRunningTime="2026-04-21 07:15:58.314697888 +0000 UTC m=+783.346749118"
Apr 21 07:15:59.299188 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:15:59.299157 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"
Apr 21 07:16:20.313487 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:20.313369 2573 prober.go:120] "Probe failed" probeType="Readiness"
pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" podUID="3ae14885-0f42-4a44-bb04-b7a08b3308d8" containerName="node" probeResult="failure" output="Get \"http://10.132.0.33:28080/metrics\": read tcp 10.132.0.2:59454->10.132.0.33:28080: read: connection reset by peer" Apr 21 07:16:20.368310 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:20.368279 2573 generic.go:358] "Generic (PLEG): container finished" podID="3ae14885-0f42-4a44-bb04-b7a08b3308d8" containerID="4c3b42ac14891c96ae7a99fa9299c3636ae51672ce0e007e4009d826bcf25a21" exitCode=0 Apr 21 07:16:20.368445 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:20.368355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" event={"ID":"3ae14885-0f42-4a44-bb04-b7a08b3308d8","Type":"ContainerDied","Data":"4c3b42ac14891c96ae7a99fa9299c3636ae51672ce0e007e4009d826bcf25a21"} Apr 21 07:16:21.490383 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:21.490361 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" Apr 21 07:16:21.578254 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:21.578227 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjtv\" (UniqueName: \"kubernetes.io/projected/3ae14885-0f42-4a44-bb04-b7a08b3308d8-kube-api-access-ktjtv\") pod \"3ae14885-0f42-4a44-bb04-b7a08b3308d8\" (UID: \"3ae14885-0f42-4a44-bb04-b7a08b3308d8\") " Apr 21 07:16:21.580315 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:21.580290 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae14885-0f42-4a44-bb04-b7a08b3308d8-kube-api-access-ktjtv" (OuterVolumeSpecName: "kube-api-access-ktjtv") pod "3ae14885-0f42-4a44-bb04-b7a08b3308d8" (UID: "3ae14885-0f42-4a44-bb04-b7a08b3308d8"). InnerVolumeSpecName "kube-api-access-ktjtv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:16:21.678937 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:21.678882 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ktjtv\" (UniqueName: \"kubernetes.io/projected/3ae14885-0f42-4a44-bb04-b7a08b3308d8-kube-api-access-ktjtv\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:16:22.377401 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.377360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" event={"ID":"3ae14885-0f42-4a44-bb04-b7a08b3308d8","Type":"ContainerDied","Data":"0fb5dbb7557c2ba99073e6abdb5bdc1adc007508492de51d5a862bd260606c1f"} Apr 21 07:16:22.377401 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.377398 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb5dbb7557c2ba99073e6abdb5bdc1adc007508492de51d5a862bd260606c1f" Apr 21 07:16:22.377401 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.377433 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc" Apr 21 07:16:22.466863 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.466832 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n589c/must-gather-cfsjp"] Apr 21 07:16:22.467120 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.467109 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ae14885-0f42-4a44-bb04-b7a08b3308d8" containerName="node" Apr 21 07:16:22.467168 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.467121 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae14885-0f42-4a44-bb04-b7a08b3308d8" containerName="node" Apr 21 07:16:22.467202 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.467187 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ae14885-0f42-4a44-bb04-b7a08b3308d8" containerName="node" Apr 21 07:16:22.494504 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.494474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n589c/must-gather-cfsjp"] Apr 21 07:16:22.494900 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.494536 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.496988 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.496964 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-n589c\"/\"kube-root-ca.crt\"" Apr 21 07:16:22.497120 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.496972 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-n589c\"/\"default-dockercfg-48cds\"" Apr 21 07:16:22.497120 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.496994 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-n589c\"/\"openshift-service-ca.crt\"" Apr 21 07:16:22.585627 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.585601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f270efda-727f-4d9e-850f-e86e4dbeb7d5-must-gather-output\") pod \"must-gather-cfsjp\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.585778 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.585659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtpsq\" (UniqueName: \"kubernetes.io/projected/f270efda-727f-4d9e-850f-e86e4dbeb7d5-kube-api-access-qtpsq\") pod \"must-gather-cfsjp\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.686232 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.686152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f270efda-727f-4d9e-850f-e86e4dbeb7d5-must-gather-output\") pod \"must-gather-cfsjp\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " 
pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.686232 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.686206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtpsq\" (UniqueName: \"kubernetes.io/projected/f270efda-727f-4d9e-850f-e86e4dbeb7d5-kube-api-access-qtpsq\") pod \"must-gather-cfsjp\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.686494 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.686472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f270efda-727f-4d9e-850f-e86e4dbeb7d5-must-gather-output\") pod \"must-gather-cfsjp\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.695106 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.695081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtpsq\" (UniqueName: \"kubernetes.io/projected/f270efda-727f-4d9e-850f-e86e4dbeb7d5-kube-api-access-qtpsq\") pod \"must-gather-cfsjp\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.804077 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.804045 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:16:22.922288 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:22.922235 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n589c/must-gather-cfsjp"] Apr 21 07:16:22.925749 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:16:22.925712 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf270efda_727f_4d9e_850f_e86e4dbeb7d5.slice/crio-7b3d23b1ff0a453cade11c22f3b04cf5240e8d794e04cde6a132d632d706445a WatchSource:0}: Error finding container 7b3d23b1ff0a453cade11c22f3b04cf5240e8d794e04cde6a132d632d706445a: Status 404 returned error can't find the container with id 7b3d23b1ff0a453cade11c22f3b04cf5240e8d794e04cde6a132d632d706445a Apr 21 07:16:23.381656 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:23.381623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n589c/must-gather-cfsjp" event={"ID":"f270efda-727f-4d9e-850f-e86e4dbeb7d5","Type":"ContainerStarted","Data":"7b3d23b1ff0a453cade11c22f3b04cf5240e8d794e04cde6a132d632d706445a"} Apr 21 07:16:27.229247 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:27.229214 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"] Apr 21 07:16:27.231658 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:27.231631 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-custom-config-node-0-0-7svkc"] Apr 21 07:16:27.536248 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:27.536177 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae14885-0f42-4a44-bb04-b7a08b3308d8" path="/var/lib/kubelet/pods/3ae14885-0f42-4a44-bb04-b7a08b3308d8/volumes" Apr 21 07:16:30.409391 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:30.409357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-n589c/must-gather-cfsjp" event={"ID":"f270efda-727f-4d9e-850f-e86e4dbeb7d5","Type":"ContainerStarted","Data":"476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1"} Apr 21 07:16:30.409391 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:30.409394 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n589c/must-gather-cfsjp" event={"ID":"f270efda-727f-4d9e-850f-e86e4dbeb7d5","Type":"ContainerStarted","Data":"4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728"} Apr 21 07:16:30.424639 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:16:30.424583 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n589c/must-gather-cfsjp" podStartSLOduration=1.966929522 podStartE2EDuration="8.424564021s" podCreationTimestamp="2026-04-21 07:16:22 +0000 UTC" firstStartedPulling="2026-04-21 07:16:22.927647631 +0000 UTC m=+807.959698846" lastFinishedPulling="2026-04-21 07:16:29.385282131 +0000 UTC m=+814.417333345" observedRunningTime="2026-04-21 07:16:30.423783378 +0000 UTC m=+815.455834606" watchObservedRunningTime="2026-04-21 07:16:30.424564021 +0000 UTC m=+815.456615250" Apr 21 07:17:15.579162 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:15.579129 2573 generic.go:358] "Generic (PLEG): container finished" podID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerID="4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728" exitCode=0 Apr 21 07:17:15.579589 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:15.579202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n589c/must-gather-cfsjp" event={"ID":"f270efda-727f-4d9e-850f-e86e4dbeb7d5","Type":"ContainerDied","Data":"4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728"} Apr 21 07:17:15.579589 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:15.579475 2573 scope.go:117] "RemoveContainer" containerID="4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728" 
Apr 21 07:17:15.822336 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:15.822304 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n589c_must-gather-cfsjp_f270efda-727f-4d9e-850f-e86e4dbeb7d5/gather/0.log" Apr 21 07:17:19.117494 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:19.117465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mqxbs_240ddbe6-7b0f-4f03-9c28-38b3756ea88b/global-pull-secret-syncer/0.log" Apr 21 07:17:19.240239 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:19.240206 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xn7rw_23dada4b-3bff-4763-9499-d08a34391b70/konnectivity-agent/0.log" Apr 21 07:17:19.308334 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:19.308305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-69.ec2.internal_192bfeaa4c26d06d04fe2b9437ecbb37/haproxy/0.log" Apr 21 07:17:21.193675 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.193642 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n589c/must-gather-cfsjp"] Apr 21 07:17:21.194063 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.193848 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-n589c/must-gather-cfsjp" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="copy" containerID="cri-o://476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1" gracePeriod=2 Apr 21 07:17:21.196007 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.195968 2573 status_manager.go:895] "Failed to get status for pod" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" pod="openshift-must-gather-n589c/must-gather-cfsjp" err="pods \"must-gather-cfsjp\" is forbidden: User \"system:node:ip-10-0-143-69.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-n589c\": no 
relationship found between node 'ip-10-0-143-69.ec2.internal' and this object" Apr 21 07:17:21.196543 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.196524 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n589c/must-gather-cfsjp"] Apr 21 07:17:21.422409 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.422386 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n589c_must-gather-cfsjp_f270efda-727f-4d9e-850f-e86e4dbeb7d5/copy/0.log" Apr 21 07:17:21.422768 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.422751 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:17:21.425400 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.425374 2573 status_manager.go:895] "Failed to get status for pod" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" pod="openshift-must-gather-n589c/must-gather-cfsjp" err="pods \"must-gather-cfsjp\" is forbidden: User \"system:node:ip-10-0-143-69.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-n589c\": no relationship found between node 'ip-10-0-143-69.ec2.internal' and this object" Apr 21 07:17:21.493591 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.493570 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtpsq\" (UniqueName: \"kubernetes.io/projected/f270efda-727f-4d9e-850f-e86e4dbeb7d5-kube-api-access-qtpsq\") pod \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") " Apr 21 07:17:21.493707 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.493621 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f270efda-727f-4d9e-850f-e86e4dbeb7d5-must-gather-output\") pod \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\" (UID: \"f270efda-727f-4d9e-850f-e86e4dbeb7d5\") 
" Apr 21 07:17:21.495708 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.495682 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f270efda-727f-4d9e-850f-e86e4dbeb7d5-kube-api-access-qtpsq" (OuterVolumeSpecName: "kube-api-access-qtpsq") pod "f270efda-727f-4d9e-850f-e86e4dbeb7d5" (UID: "f270efda-727f-4d9e-850f-e86e4dbeb7d5"). InnerVolumeSpecName "kube-api-access-qtpsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:17:21.495796 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.495778 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f270efda-727f-4d9e-850f-e86e4dbeb7d5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f270efda-727f-4d9e-850f-e86e4dbeb7d5" (UID: "f270efda-727f-4d9e-850f-e86e4dbeb7d5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:17:21.536193 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.536167 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" path="/var/lib/kubelet/pods/f270efda-727f-4d9e-850f-e86e4dbeb7d5/volumes" Apr 21 07:17:21.594247 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.594227 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtpsq\" (UniqueName: \"kubernetes.io/projected/f270efda-727f-4d9e-850f-e86e4dbeb7d5-kube-api-access-qtpsq\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:17:21.594247 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.594246 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f270efda-727f-4d9e-850f-e86e4dbeb7d5-must-gather-output\") on node \"ip-10-0-143-69.ec2.internal\" DevicePath \"\"" Apr 21 07:17:21.598656 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.598634 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-n589c_must-gather-cfsjp_f270efda-727f-4d9e-850f-e86e4dbeb7d5/copy/0.log" Apr 21 07:17:21.598938 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.598915 2573 generic.go:358] "Generic (PLEG): container finished" podID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerID="476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1" exitCode=143 Apr 21 07:17:21.599000 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.598966 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n589c/must-gather-cfsjp" Apr 21 07:17:21.599054 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.599012 2573 scope.go:117] "RemoveContainer" containerID="476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1" Apr 21 07:17:21.606216 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.606198 2573 scope.go:117] "RemoveContainer" containerID="4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728" Apr 21 07:17:21.617910 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.617895 2573 scope.go:117] "RemoveContainer" containerID="476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1" Apr 21 07:17:21.618157 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:17:21.618139 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1\": container with ID starting with 476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1 not found: ID does not exist" containerID="476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1" Apr 21 07:17:21.618214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.618164 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1"} err="failed to get container status 
\"476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1\": rpc error: code = NotFound desc = could not find container \"476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1\": container with ID starting with 476499d5a5432e70c9cd30394f0f3843ed6dede4c1a45adf6408b705f7731bb1 not found: ID does not exist" Apr 21 07:17:21.618214 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.618181 2573 scope.go:117] "RemoveContainer" containerID="4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728" Apr 21 07:17:21.618404 ip-10-0-143-69 kubenswrapper[2573]: E0421 07:17:21.618390 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728\": container with ID starting with 4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728 not found: ID does not exist" containerID="4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728" Apr 21 07:17:21.618447 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:21.618407 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728"} err="failed to get container status \"4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728\": rpc error: code = NotFound desc = could not find container \"4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728\": container with ID starting with 4e105d26bc52729ed80b49fb23c5f49cdd2d4eaf0e93ebdd88033972a79da728 not found: ID does not exist" Apr 21 07:17:22.317524 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:22.317478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6rh5r_71f6c66b-8c36-497f-9098-f070725c4d1d/cluster-monitoring-operator/0.log" Apr 21 07:17:22.505804 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:22.505777 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-r4jlc_1fe2d2b7-47fc-4a5d-a8b5-19fc771065e8/monitoring-plugin/0.log" Apr 21 07:17:22.549224 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:22.549199 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8sjbz_84902026-6f14-435c-bdd6-a7e07f14ac16/node-exporter/0.log" Apr 21 07:17:22.573458 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:22.573410 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8sjbz_84902026-6f14-435c-bdd6-a7e07f14ac16/kube-rbac-proxy/0.log" Apr 21 07:17:22.602102 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:22.602079 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8sjbz_84902026-6f14-435c-bdd6-a7e07f14ac16/init-textfile/0.log" Apr 21 07:17:23.074854 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:23.074828 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-stjpk_b0249fec-358f-462d-9041-e00bf841cdd3/prometheus-operator/0.log" Apr 21 07:17:23.099450 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:23.099424 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-stjpk_b0249fec-358f-462d-9041-e00bf841cdd3/kube-rbac-proxy/0.log" Apr 21 07:17:23.125841 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:23.125819 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-87n45_f8f070b4-00c8-456b-802c-4794f5d87b21/prometheus-operator-admission-webhook/0.log" Apr 21 07:17:24.539808 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:24.539777 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-d8wdr_fd0f8ab2-d283-4079-8826-80cb40b62cab/networking-console-plugin/0.log" Apr 21 07:17:25.010323 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:25.010292 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/1.log" Apr 21 07:17:25.015727 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:25.015707 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4gb9q_22ef3159-4fb3-4a8b-8264-e9ee14be3a04/console-operator/2.log" Apr 21 07:17:25.404761 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:25.404688 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97784d49c-2s8qx_f2f28879-c606-442a-aa11-308f66af2653/console/0.log" Apr 21 07:17:25.836707 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:25.836678 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-9kqbb_8da13794-b67d-4df5-9370-57ce6358959a/volume-data-source-validator/0.log" Apr 21 07:17:26.432067 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432036 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"] Apr 21 07:17:26.432342 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432328 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="copy" Apr 21 07:17:26.432399 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432344 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="copy" Apr 21 07:17:26.432399 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432365 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="gather"
Apr 21 07:17:26.432399 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432372 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="gather"
Apr 21 07:17:26.432490 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432419 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="copy"
Apr 21 07:17:26.432490 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.432431 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f270efda-727f-4d9e-850f-e86e4dbeb7d5" containerName="gather"
Apr 21 07:17:26.437553 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.437533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.439685 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.439660 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nj2kz\"/\"default-dockercfg-rwzml\""
Apr 21 07:17:26.440600 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.440585 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nj2kz\"/\"kube-root-ca.crt\""
Apr 21 07:17:26.440658 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.440639 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nj2kz\"/\"openshift-service-ca.crt\""
Apr 21 07:17:26.447555 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.447536 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"]
Apr 21 07:17:26.531742 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.531709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-proc\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.531916 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.531749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-podres\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.531916 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.531783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-lib-modules\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.531916 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.531810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6tnk\" (UniqueName: \"kubernetes.io/projected/8df52981-4134-4392-b56f-0771cec607de-kube-api-access-c6tnk\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.531916 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.531906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-sys\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.534078 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.534057 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wknz9_4519c586-5721-4cb3-bffc-7f4b13237ef7/dns/0.log"
Apr 21 07:17:26.554653 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.554634 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wknz9_4519c586-5721-4cb3-bffc-7f4b13237ef7/kube-rbac-proxy/0.log"
Apr 21 07:17:26.632582 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-proc\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632686 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-podres\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632686 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-proc\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632787 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-podres\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632787 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-lib-modules\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632883 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6tnk\" (UniqueName: \"kubernetes.io/projected/8df52981-4134-4392-b56f-0771cec607de-kube-api-access-c6tnk\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632936 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-lib-modules\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.632980 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.632936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-sys\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.633056 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.633041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8df52981-4134-4392-b56f-0771cec607de-sys\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.640408 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.640385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6tnk\" (UniqueName: \"kubernetes.io/projected/8df52981-4134-4392-b56f-0771cec607de-kube-api-access-c6tnk\") pod \"perf-node-gather-daemonset-59rd2\" (UID: \"8df52981-4134-4392-b56f-0771cec607de\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.671225 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.671202 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q7rfs_2f25cb44-1f59-45ee-8bd4-d80ef4c1366b/dns-node-resolver/0.log"
Apr 21 07:17:26.748330 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.748309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:26.873186 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:26.873159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"]
Apr 21 07:17:26.875199 ip-10-0-143-69 kubenswrapper[2573]: W0421 07:17:26.875170 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8df52981_4134_4392_b56f_0771cec607de.slice/crio-969983ea76799f6d3f7ab6d68b5c042b830cacd6312c0809431d4851e623daa7 WatchSource:0}: Error finding container 969983ea76799f6d3f7ab6d68b5c042b830cacd6312c0809431d4851e623daa7: Status 404 returned error can't find the container with id 969983ea76799f6d3f7ab6d68b5c042b830cacd6312c0809431d4851e623daa7
Apr 21 07:17:27.165026 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:27.164957 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zzsbr_85559148-4ea4-4bfd-8bf0-55be583da361/node-ca/0.log"
Apr 21 07:17:27.619117 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:27.619084 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2" event={"ID":"8df52981-4134-4392-b56f-0771cec607de","Type":"ContainerStarted","Data":"9333106a0efcae03ef6c15d2f1fa7e3fd4efd30f33aa0ccc884f9184cd8afa13"}
Apr 21 07:17:27.619117 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:27.619116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2" event={"ID":"8df52981-4134-4392-b56f-0771cec607de","Type":"ContainerStarted","Data":"969983ea76799f6d3f7ab6d68b5c042b830cacd6312c0809431d4851e623daa7"}
Apr 21 07:17:27.619314 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:27.619202 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:27.638975 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:27.638938 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2" podStartSLOduration=1.638923533 podStartE2EDuration="1.638923533s" podCreationTimestamp="2026-04-21 07:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:17:27.636981333 +0000 UTC m=+872.669032560" watchObservedRunningTime="2026-04-21 07:17:27.638923533 +0000 UTC m=+872.670974761"
Apr 21 07:17:27.967301 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:27.967227 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-57f7f9fd66-mtt95_b6abfd8a-5d8b-4af8-94a1-95cf455336e0/router/0.log"
Apr 21 07:17:28.347861 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:28.347837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vrfxd_0816ede2-8af6-41c9-b423-5c313bc38315/serve-healthcheck-canary/0.log"
Apr 21 07:17:28.694089 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:28.694010 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w9nvv_44af391c-8f7a-471b-a4eb-25f3b5519c86/insights-operator/0.log"
Apr 21 07:17:28.694089 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:28.694052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w9nvv_44af391c-8f7a-471b-a4eb-25f3b5519c86/insights-operator/1.log"
Apr 21 07:17:28.854013 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:28.853988 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fjqc2_a82f45ca-4a3a-421a-9360-c03b95c5ce27/kube-rbac-proxy/0.log"
Apr 21 07:17:28.874225 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:28.874201 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fjqc2_a82f45ca-4a3a-421a-9360-c03b95c5ce27/exporter/0.log"
Apr 21 07:17:28.894390 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:28.894373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fjqc2_a82f45ca-4a3a-421a-9360-c03b95c5ce27/extractor/0.log"
Apr 21 07:17:33.631286 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:33.631259 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-59rd2"
Apr 21 07:17:34.126778 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:34.126743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4pwc4_c2986c84-0eaa-4d7a-a7c4-5337ab7f4875/kube-storage-version-migrator-operator/1.log"
Apr 21 07:17:34.127593 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:34.127575 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4pwc4_c2986c84-0eaa-4d7a-a7c4-5337ab7f4875/kube-storage-version-migrator-operator/0.log"
Apr 21 07:17:35.063639 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.063614 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6krhs_08ad7b6d-5db7-4175-a947-75d82fb3d9ef/kube-multus/0.log"
Apr 21 07:17:35.317743 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.317709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/kube-multus-additional-cni-plugins/0.log"
Apr 21 07:17:35.347080 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.347056 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/egress-router-binary-copy/0.log"
Apr 21 07:17:35.369197 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.369170 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/cni-plugins/0.log"
Apr 21 07:17:35.391398 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.391381 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/bond-cni-plugin/0.log"
Apr 21 07:17:35.417287 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.417267 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/routeoverride-cni/0.log"
Apr 21 07:17:35.447876 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.447826 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/whereabouts-cni-bincopy/0.log"
Apr 21 07:17:35.473305 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.473286 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rlt5l_8ce849fd-7b86-4acc-b03c-5583cbf4cc68/whereabouts-cni/0.log"
Apr 21 07:17:35.755118 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.755099 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qzgxt_e4d3a3ee-1584-42b6-a403-4bb39d451cab/network-metrics-daemon/0.log"
Apr 21 07:17:35.775635 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:35.775620 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qzgxt_e4d3a3ee-1584-42b6-a403-4bb39d451cab/kube-rbac-proxy/0.log"
Apr 21 07:17:37.200618 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.200592 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/ovn-controller/0.log"
Apr 21 07:17:37.224676 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.224658 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/ovn-acl-logging/0.log"
Apr 21 07:17:37.246344 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.246325 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/kube-rbac-proxy-node/0.log"
Apr 21 07:17:37.272209 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.272189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 07:17:37.296310 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.296293 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/northd/0.log"
Apr 21 07:17:37.322156 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.322136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/nbdb/0.log"
Apr 21 07:17:37.350000 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.349982 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/sbdb/0.log"
Apr 21 07:17:37.444954 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:37.444897 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt7hd_9890b61f-81d9-4bd9-a0d8-9cbf41de4590/ovnkube-controller/0.log"
Apr 21 07:17:38.432213 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:38.432183 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-qpjps_bb10b08b-2c33-4546-889b-697fd8825b2f/check-endpoints/0.log"
Apr 21 07:17:38.509673 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:38.509646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qv6dn_5affaa81-79dd-4de7-85b9-98182a2406f0/network-check-target-container/0.log"
Apr 21 07:17:39.397439 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:39.397406 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-48872_2bf8d8f9-a085-4c41-8558-9fd3edcddb6f/iptables-alerter/0.log"
Apr 21 07:17:40.146041 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:40.146015 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xxdx2_f64f9328-2e8e-457d-ab14-8b16c32be65a/tuned/0.log"
Apr 21 07:17:41.790164 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:41.790136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2z97m_ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2/cluster-samples-operator/0.log"
Apr 21 07:17:41.807943 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:41.807923 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2z97m_ae72d73c-bcc9-49ba-bd5d-9f02ccfca5c2/cluster-samples-operator-watch/0.log"
Apr 21 07:17:42.730742 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:42.730708 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-hq6q2_de46750f-df1b-4469-a3bd-4300d5fa0f79/service-ca-operator/1.log"
Apr 21 07:17:42.731553 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:42.731536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-hq6q2_de46750f-df1b-4469-a3bd-4300d5fa0f79/service-ca-operator/0.log"
Apr 21 07:17:43.481277 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:43.481254 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dps9l_c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4/csi-driver/0.log"
Apr 21 07:17:43.504783 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:43.504729 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dps9l_c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4/csi-node-driver-registrar/0.log"
Apr 21 07:17:43.528504 ip-10-0-143-69 kubenswrapper[2573]: I0421 07:17:43.528483 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dps9l_c146c4d5-3700-4a77-bb4c-1fb3a2b4a1c4/csi-liveness-probe/0.log"